Advances in Chinese Pre-training Models
- Authors
- HOU Yu-tao, ABULIZI Abudukelimu, ABUDUKELIMU Halidanmu
- Subjects
- Chinese pre-training models, natural language processing, word embedding, pre-training, deep learning, Computer software, QA76.75-76.765, Technology (General), T1-995
- Abstract
In recent years, pre-training models have flourished in the field of natural language processing, aiming to model and represent the implicit knowledge of natural language. However, most mainstream pre-training models target English, and work in the Chinese domain started relatively late. Given the importance of Chinese in natural language processing, extensive research has been conducted in both academia and industry, and numerous Chinese pre-training models have been proposed. This paper presents a comprehensive review of research on Chinese pre-training models. It first introduces pre-training models and their development history, then reviews Transformer and BERT, the two classical models on which most Chinese pre-training models are built. It then proposes a classification of Chinese pre-training models by model category and summarizes the evaluation benchmarks used in the Chinese domain. Finally, it discusses future development trends of Chinese pre-training models. The survey aims to help researchers gain a more comprehensive understanding of the development of Chinese pre-training models and to provide ideas for the design of new models.
- Published
- 2022