
Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese

Authors :
Zhang, Zhuosheng
Zhang, Hanqing
Chen, Keming
Guo, Yuhang
Hua, Jingyun
Wang, Yulong
Zhou, Ming
Publication Year :
2021

Abstract

Although pre-trained language models (PLMs) have achieved remarkable improvements in a wide range of NLP tasks, they are expensive in terms of time and resources. This calls for the study of training more efficient models with less computation while still ensuring impressive performance. Instead of pursuing a larger scale, we are committed to developing lightweight yet more powerful models trained with equal or less computation and friendly to rapid deployment. This technical report releases our pre-trained model called Mengzi, which stands for a family of discriminative, generative, domain-specific, and multimodal pre-trained model variants, capable of a wide range of language and vision tasks. Compared with public Chinese PLMs, Mengzi is simple but more powerful. Our lightweight model has achieved new state-of-the-art results on the widely-used CLUE benchmark with our optimized pre-training and fine-tuning techniques. Without modifying the model architecture, our model can be easily employed as an alternative to existing PLMs. Our sources are available at https://github.com/Langboat/Mengzi.
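The abstract notes that Mengzi keeps the standard architecture and can serve as a drop-in replacement for existing PLMs. A minimal sketch of that usage with the Hugging Face Transformers library is shown below; the checkpoint identifier "Langboat/mengzi-bert-base" is an assumption based on the project's GitHub organization, so check the repository above for the actual released names.

    from transformers import BertTokenizer, BertModel

    # Assumed checkpoint name; the released identifiers are listed in the Mengzi repository.
    model_name = "Langboat/mengzi-bert-base"

    # Because Mengzi retains the BERT architecture, the standard BERT classes load it unchanged.
    tokenizer = BertTokenizer.from_pretrained(model_name)
    model = BertModel.from_pretrained(model_name)

    # Encode a Chinese sentence ("Mengzi was a thinker of the Warring States period.")
    inputs = tokenizer("孟子是战国时期的思想家。", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)

In this setup, swapping an existing Chinese BERT checkpoint for Mengzi only requires changing the model name string; no architecture or fine-tuning code changes are needed.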

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333724338
Document Type :
Electronic Resource