
MUST: A Multilingual Student-Teacher Learning approach for low-resource speech recognition

Authors:
Farooq, Muhammad Umar
Ahmad, Rehan
Hain, Thomas
Publication Year:
2023

Abstract

Student-teacher learning, or knowledge distillation (KD), has previously been used to address the data scarcity issue in training speech recognition (ASR) systems. However, a limitation of KD training is that the student model's classes must be a proper or improper subset of the teacher model's classes. This prevents distillation even from acoustically similar languages if their character sets are not the same. In this work, the aforementioned limitation is addressed by proposing MUltilingual Student-Teacher (MUST) learning, which exploits a posterior mapping approach. A pre-trained mapping model is used to map posteriors from a teacher language to the student-language ASR. These mapped posteriors are used as soft labels for KD learning. Various teacher ensemble schemes are experimented with to train an ASR model for low-resource languages. A model trained with MUST learning reduces the relative character error rate (CER) by up to 9.5% in comparison with a baseline monolingual ASR.

Comment: Accepted for IEEE ASRU 2023
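The mechanism the abstract describes can be sketched numerically: teacher-language posteriors are projected into the student's class space by a mapping, and the result serves as soft labels for a KD loss. The sketch below is a minimal illustration, not the paper's method — the class counts, the random row-stochastic matrix standing in for the pre-trained mapping model, and the KL-divergence form of the KD loss are all assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical class-set sizes: teacher and student languages use
# different character inventories (assumed numbers, for illustration).
T_CLASSES, S_CLASSES = 40, 30
rng = np.random.default_rng(0)

# Stand-in for the pre-trained mapping model: a row-stochastic matrix
# taking teacher-class posteriors to student-class posteriors.
mapping = softmax(rng.normal(size=(T_CLASSES, S_CLASSES)), axis=-1)

# Teacher posteriors for 5 acoustic frames (each row sums to 1).
teacher_post = softmax(rng.normal(size=(5, T_CLASSES)), axis=-1)

# Mapped posteriors: valid distributions over the student's classes,
# usable as soft labels even though the character sets differ.
soft_labels = teacher_post @ mapping            # shape (5, S_CLASSES)

# Student predictions and a KD loss (mean KL divergence to soft labels).
student_post = softmax(rng.normal(size=(5, S_CLASSES)), axis=-1)
kd_loss = np.mean(
    np.sum(soft_labels * (np.log(soft_labels) - np.log(student_post)), axis=-1)
)
```

Because both the teacher posteriors and the mapping rows sum to one, each row of `soft_labels` is itself a proper distribution over student classes, which is what lets the KD loss be applied across mismatched character sets.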

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2310.18865
Document Type:
Working Paper