Enhancing Low-Resource NMT with a Multilingual Encoder and Knowledge Distillation: A Case Study

By Aniruddha Roy and others
Neural Machine Translation (NMT) remains a formidable challenge, especially when dealing with low-resource languages. Pre-trained sequence-to-sequence (seq2seq) multilingual models, such as mBART-50, have demonstrated impressive performance on various low-resource NMT tasks. However, their pre-training has been confined to 50 languages, leaving out support for numerous low-resource languages, particularly those spoken...
July 9, 2024
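The full text has not loaded on this page, so as a rough orientation for the distillation component named in the title, the sketch below shows one common formulation of word-level knowledge distillation for NMT: the student is trained on a mix of cross-entropy against the reference translation and KL divergence toward a teacher's temperature-softened output distribution (the teacher could, for instance, be built on mBART-50). All shapes and hyperparameters here (`temperature`, `alpha`, `pad_id`) are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=2.0, alpha=0.5, pad_id=1):
    """Word-level KD: mix cross-entropy on the gold reference with
    KL divergence toward the teacher's softened distribution.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    targets: (batch, seq_len) gold token ids
    pad_id is model-specific (a placeholder here).
    """
    vocab = student_logits.size(-1)
    # Standard NMT cross-entropy on the reference, ignoring padding.
    ce = F.cross_entropy(student_logits.reshape(-1, vocab),
                         targets.reshape(-1),
                         ignore_index=pad_id)
    # Temperature-softened student/teacher distributions; the t^2 factor
    # keeps gradient magnitudes comparable across temperatures.
    t = temperature
    kd = F.kl_div(F.log_softmax(student_logits / t, dim=-1),
                  F.softmax(teacher_logits / t, dim=-1),
                  reduction="batchmean") * (t * t)
    # A fuller implementation would also mask padded positions in `kd`.
    return alpha * kd + (1.0 - alpha) * ce
```

In practice the teacher runs in eval mode with gradients disabled, and `teacher_logits` are computed once per batch before each student update.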