ASEAN Journal on Science and Technology for Development

Abstract

Natural Language Generation (NLG) plays a crucial role in modern digital tools, including chatbots, virtual assistants, content suggestions, and tailored marketing, making bots more responsive and reducing the need for human staff. While NLG has been researched extensively for languages such as English, languages such as Arabic, Urdu, and Chinese still face challenges. This study examines Arabic NLG's unique aspects, dialects, and word variations. With around 420 million Arabic speakers globally, it is crucial to advance NLG for this language. We compared three models: Long Short-Term Memory (LSTM), a combination of Bidirectional Encoder Representations from Transformers (BERT) and LSTM, and a variant that adds Dynamic Skip Connections (DSC). Our aim is to identify the model that predicts Arabic words with the fewest errors. In our experiments, we found that adding DSC was not beneficial. However, combining BERT and LSTM with an attention mechanism reduced the loss and yielded favorable perplexity values.
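
As an illustrative sketch only (not the authors' implementation), the BERT-plus-LSTM-with-attention idea compared in the abstract could be expressed roughly as follows in PyTorch. The pretrained Arabic BERT checkpoint, hidden size, and the additive attention head are all assumptions made for illustration.

# Sketch: BERT contextual embeddings feeding an LSTM, with additive attention
# over the LSTM states and a linear layer producing next-word logits.
# Model name and hyperparameters below are assumed, not taken from the paper.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertLstmAttentionLM(nn.Module):
    def __init__(self, bert_name="aubmindlab/bert-base-arabertv2", lstm_hidden=256):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)      # contextual embeddings
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden, batch_first=True)
        self.attn = nn.Linear(lstm_hidden, 1)                 # additive attention scores
        self.out = nn.Linear(lstm_hidden, self.bert.config.vocab_size)

    def forward(self, input_ids, attention_mask):
        with torch.no_grad():                                  # BERT kept frozen in this sketch
            emb = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state
        states, _ = self.lstm(emb)                             # (batch, seq, hidden)
        scores = self.attn(states).squeeze(-1)                 # (batch, seq)
        scores = scores.masked_fill(attention_mask == 0, -1e9) # ignore padding positions
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * states).sum(dim=1)                # attention-weighted summary
        return self.out(context)                               # logits over the vocabulary

tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv2")
batch = tokenizer(["مثال على جملة عربية"], return_tensors="pt", padding=True)
logits = BertLstmAttentionLM()(batch["input_ids"], batch["attention_mask"])

Training such a head with cross-entropy on the next token would yield the loss and perplexity figures the abstract refers to; the DSC variant would modify the recurrent connections inside the LSTM rather than this outer structure.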

Keywords

Arabic, Natural Language Generation (NLG), BERT, Long Short-Term Memory (LSTM), Dynamic Skip Connections (DSC), Deep Learning Architecture, Transformer Models, Arabic Language Modeling, Attention Mechanism, Machine Learning, AI.

Publication Date

2024

Received Date

23-Oct-2023

Revised Date

2-Jan-2024

Accepted Date

28-Feb-2024
