
Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation

Authors :
Cruz, Jan Christian Blaise
Aji, Alham Fikri
Publication Year :
2025

Abstract

In this paper, we propose the use of simple knowledge distillation to produce smaller and more efficient single-language transformers from Massively Multilingual Transformers (MMTs), alleviating the tradeoffs associated with using MMTs in low-resource settings. Using Tagalog as a case study, we show that these smaller single-language models perform on par with strong baselines on a variety of benchmark tasks while being much more efficient. Furthermore, we investigate additional steps during the distillation process that improve the soft supervision of the target language, and provide a number of analyses and ablations to show the efficacy of the proposed method.

Comment: LoResLM Workshop @ COLING 2025
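The abstract does not spell out the exact distillation objective, but the "soft supervision" it refers to is typically the standard soft-label knowledge distillation loss (Hinton et al., 2015): the student is trained to match the teacher's temperature-softened output distribution. The sketch below, in PyTorch, illustrates that generic objective under those assumptions; the function name, temperature, and tensor shapes are illustrative, not taken from the paper.

```python
# Minimal sketch of a soft-label knowledge distillation loss
# (generic Hinton-style KD, not necessarily the authors' exact recipe).
import torch
import torch.nn.functional as F


def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 so gradients stay comparable across T."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temperature ** 2


if __name__ == "__main__":
    # Hypothetical shapes: the teacher would be a large multilingual LM and the
    # student a smaller single-language model, both emitting per-token logits
    # over a shared vocabulary.
    batch, seq_len, vocab = 2, 8, 32000
    teacher_logits = torch.randn(batch, seq_len, vocab)
    student_logits = torch.randn(batch, seq_len, vocab, requires_grad=True)
    loss = distillation_loss(student_logits.view(-1, vocab),
                             teacher_logits.view(-1, vocab))
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```

In practice this soft-label term is usually combined with a task loss (e.g., masked language modeling) on target-language text; how the paper balances or extends these terms is described in the full text, not here.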

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2501.12660
Document Type :
Working Paper