Article (Scientific journals)
Is Small Language Model the Silver Bullet to Low-Resource Languages Machine Translation?
SONG, Yewei; LI, Lujun; LOTHRITZ, Cedric et al.
2025, in LoResMT @ EACL
Peer reviewed
 

Files


Full Text
ACL_2025_Proceedings___Machine_Translation (2).pdf
Author postprint (3.75 MB) Creative Commons License - Attribution, Non-Commercial
Details



Keywords :
Computer Science - Computation and Language
Abstract :
[en] Low-resource languages (LRLs) lack sufficient linguistic resources and are underrepresented in benchmark datasets, resulting in persistently lower translation quality than for high-resource languages, especially in privacy-sensitive and resource-limited contexts. First, this study systematically evaluates state-of-the-art smaller Large Language Models across 200 languages using the FLORES-200 benchmark, highlighting persistent deficiencies and disparities in the translation of LRLs. To mitigate these limitations, we investigate knowledge distillation from large pre-trained teacher models to Small Language Models (SLMs) through supervised fine-tuning. The results show substantial improvements; for example, the translation performance of English to Luxembourgish (EN to LB), measured by the LLM-as-a-Judge score, increases from 0.36 to 0.89 on the validation set for Llama-3.2-3B. We further investigate various fine-tuning configurations and tasks to clarify the trade-offs between data scale and training efficiency, verify that the model retains its general capabilities without significant catastrophic forgetting after training, and explore how distillation benefits other LRLs on SLMs (Khasi, Assamese, and Ukrainian). Overall, this work exposes the limitations and fairness issues of current SLMs in LRL translation, systematically explores the potential of distilling knowledge from large models into small ones, and offers practical, empirically grounded recommendations for improving LRL translation systems.
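The distillation setup described in the abstract (teacher-model translations used as supervised fine-tuning targets for a small student model) can be sketched as follows. The prompt template, record layout, and example sentence pair are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch: wrapping teacher-produced translations into chat-style
# supervised fine-tuning (SFT) records for a small student model.
# The prompt wording and field names are assumptions for illustration;
# the resulting records match the format commonly accepted by SFT
# frameworks such as TRL's SFTTrainer.

def build_sft_example(src_text, teacher_translation,
                      src_lang="English", tgt_lang="Luxembourgish"):
    """Pair one source sentence with the teacher model's translation
    as a single user/assistant training example."""
    return {
        "messages": [
            {"role": "user",
             "content": f"Translate the following {src_lang} sentence "
                        f"into {tgt_lang}:\n{src_text}"},
            {"role": "assistant", "content": teacher_translation},
        ]
    }

# Hypothetical EN -> LB pair distilled from a large teacher model.
pairs = [("Good morning, how are you?", "Gudde Moien, wéi geet et dir?")]
dataset = [build_sft_example(src, tgt) for src, tgt in pairs]
```

Fine-tuning the student (e.g. Llama-3.2-3B) on such records is what the abstract refers to as knowledge distillation via supervised fine-tuning; evaluation would then compare pre- and post-distillation outputs on FLORES-200.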
Disciplines :
Computer science
Author, co-author :
SONG, Yewei; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
LI, Lujun; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SEDAN
LOTHRITZ, Cedric; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > TruX > Team Tegawendé François d A BISSYANDE
EZZINI, Saad; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust > TruX > Team Jacques KLEIN
SLEEM, Lama; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SEDAN
GENTILE, Niccolo; University of Luxembourg > Faculty of Humanities, Education and Social Sciences > Department of Behavioural and Cognitive Sciences > Team Conchita D AMBROSIO
STATE, Radu; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SEDAN
BISSYANDE, Tegawendé; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
KLEIN, Jacques; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
External co-authors :
yes
Language :
English
Title :
Is Small Language Model the Silver Bullet to Low-Resource Languages Machine Translation?
Publication date :
2025
Journal title :
LoResMT @ EACL
Peer reviewed :
Peer reviewed
Available on ORBilu :
since 05 March 2026
