
Language Scaling for Universal Suggested Replies Model

Authors :
Ying, Qianlan
Bajaj, Payal
Deb, Budhaditya
Yang, Yu
Wang, Wei
Lin, Bojia
Shokouhi, Milad
Song, Xia
Yang, Yang
Jiang, Daxin
Publication Year :
2021

Abstract

We consider the problem of scaling automated suggested replies in the Outlook email system to multiple languages. Faced with increased compute requirements and limited resources for language expansion, we build a single universal model to improve quality and reduce the run-time costs of our production system. However, restrictions on data movement across regional data centers prevent joint training across languages. To this end, we propose a multi-task continual learning framework with auxiliary tasks and language adapters to learn a universal language representation across regions. The experimental results show positive cross-lingual transfer across languages while reducing catastrophic forgetting across regions. Our online results on real user traffic show significant gains in click-through rate (CTR) and characters saved, along with a 65% reduction in training cost compared with per-language models. As a consequence, we have scaled the feature to multiple languages, including low-resource markets.
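The abstract does not specify the adapter architecture, but "language adapters" in this setting commonly refers to small bottleneck modules inserted into a frozen shared encoder, one per language or region, so that region-by-region training updates only the adapter and leaves shared weights intact. The following is a minimal sketch of that idea in PyTorch; the module name, `hidden_size`, and `bottleneck` dimensions are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class LanguageAdapter(nn.Module):
    """Illustrative bottleneck adapter (assumed design, not the paper's code).

    One adapter is kept per language/region. With the shared encoder
    frozen, continual training in a new region only updates this small
    module, which helps limit catastrophic forgetting of earlier regions.
    """

    def __init__(self, hidden_size: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)  # project down
        self.up = nn.Linear(bottleneck, hidden_size)    # project back up
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection preserves the shared universal representation;
        # the adapter only learns a small language-specific correction.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


# Usage sketch: apply a per-language adapter to encoder outputs.
adapter = LanguageAdapter()
encoder_output = torch.randn(2, 16, 768)  # (batch, seq_len, hidden)
adapted = adapter(encoder_output)
```

Because each adapter adds only a few percent of the encoder's parameters, this style of design is one plausible way to reconcile the paper's two constraints: a single universal model for run-time efficiency, and region-local training imposed by data-movement restrictions.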

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1269555213
Document Type :
Electronic Resource