Research Article Open Access

Bangla↔English Machine Translation Using Attention-based Multi-Headed Transformer Model

Argha Chandra Dhar1, Arna Roy1, M. A. H. Akhand1, Md Abdus Samad Kamal2 and Nazmul Siddique3
  • 1 Khulna University of Engineering and Technology, Bangladesh
  • 2 Gunma University, Japan
  • 3 Ulster University, United Kingdom

Abstract

Machine Translation (MT) refers to translating texts or documents from a source language into a target language without human intervention. Any MT model is language-dependent, and its development requires grammar, phrase rules, vocabulary, or relevant data for the particular language pair. Although Bangla is a major language, little research on Bangla-English MT has hitherto been reported in the literature. This study presents a deep learning-based MT system for both-way translation of the Bangla-English language pair. The attention-based multi-headed transformer model is considered in this study for its significant capacity to process input in parallel. A transformer model consisting of encoders and decoders is adapted by tuning different parameters (especially the number of heads) to identify the best-performing model for Bangla to English and vice versa. The proposed model is tested on the SUPara benchmark Bangla-English corpus and evaluated using the Bilingual Evaluation Understudy (BLEU) score, currently the most popular evaluation metric in the MT field. The proposed method proves to be a promising Bangla-English MT system, achieving BLEU scores of 21.42 and 25.44 for the Bangla to English and English to Bangla MT cases, respectively.
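The core operation of the transformer model the abstract refers to is multi-headed self-attention: the input is projected into queries, keys, and values, split across several heads, and each head attends over the sequence independently before the results are recombined. The following is a minimal illustrative sketch of that operation (not the authors' code); the dimensions, head count, and random stand-in weights are hypothetical.

```python
# Minimal sketch of scaled dot-product multi-head self-attention,
# the building block of the transformer encoder/decoder layers.
# All weights here are random placeholders for learned parameters.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads):
    """Self-attention over x of shape (seq_len, d_model),
    split across num_heads heads."""
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_k = d_model // num_heads
    rng = np.random.default_rng(0)
    # Random projections stand in for the learned Q, K, V, and output weights.
    w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * d_model ** -0.5
                          for _ in range(4))
    # Project, then reshape to (num_heads, seq_len, d_k) so heads run in parallel.
    q = (x @ w_q).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (num_heads, seq_len, seq_len).
    scores = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_k))
    # Concatenate the heads back to (seq_len, d_model) and mix with w_o.
    heads = (scores @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return heads @ w_o

out = multi_head_attention(np.ones((5, 64)), num_heads=8)
print(out.shape)  # (5, 64)
```

Tuning the number of heads, as the study does, trades how many distinct attention patterns each layer can learn against the per-head dimensionality `d_k`.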

Journal of Computer Science
Volume 17 No. 10, 2021, 1000-1010

DOI: https://doi.org/10.3844/jcssp.2021.1000.1010

Submitted On: 22 June 2021 Published On: 10 November 2021

How to Cite: Dhar, A. C., Roy, A., Akhand, M. A. H., Kamal, M. A. S. & Siddique, N. (2021). Bangla↔English Machine Translation Using Attention-based Multi-Headed Transformer Model. Journal of Computer Science, 17(10), 1000-1010. https://doi.org/10.3844/jcssp.2021.1000.1010

Keywords

  • Deep Learning
  • Machine Translation
  • Neural Machine Translation
  • Transformer Model