Advances in Chinese Pre-training Models
- Source :
- Jisuanji kexue, Vol. 49, Iss. 7, pp. 148-163 (2022)
- Publication Year :
- 2022
- Publisher :
- Editorial Office of Computer Science, 2022.
-
Abstract
- In recent years, pre-training models have flourished in the field of natural language processing, aiming to model and represent the implicit knowledge of natural language. However, most mainstream pre-training models target English, and research in the Chinese domain started relatively late. Given the importance of Chinese in natural language processing, extensive research has been conducted in both academia and industry, and numerous Chinese pre-training models have been proposed. This paper presents a comprehensive review of research on Chinese pre-training models. It first introduces an overview of pre-training models and their development history, then reviews the two classical models, Transformer and BERT, on which most Chinese pre-training models are built, proposes a classification of Chinese pre-training models by model category, and summarizes the different evaluation benchmarks in the Chinese domain. Finally, future development trends of Chinese pre-training models are discussed. The review aims to help researchers gain a more comprehensive understanding of the development of Chinese pre-training models and to provide ideas for the proposal of new models.
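- As a minimal illustration of the kind of model the survey covers (not drawn from the paper itself), the sketch below loads a publicly available Chinese BERT checkpoint with the Hugging Face transformers library and encodes a short Chinese sentence; the model ID bert-base-chinese is an assumption chosen purely for illustration.

```python
# Minimal sketch, assuming the Hugging Face `transformers` library (with a
# PyTorch backend) is installed. "bert-base-chinese" is a public checkpoint
# used here only as an example; the surveyed paper does not prescribe it.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")

# Encode a short Chinese sentence; this checkpoint tokenizes Chinese
# text character by character.
inputs = tokenizer("预训练模型推动了自然语言处理的发展。", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state holds one contextual vector per token,
# with shape (batch_size, sequence_length, 768) for this base model.
print(outputs.last_hidden_state.shape)
```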
Details
- Language :
- Chinese
- ISSN :
- 1002-137X
- Volume :
- 49
- Issue :
- 7
- Database :
- Directory of Open Access Journals
- Journal :
- Jisuanji kexue
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.b3a63f392f6f47418f95dd6dacf2b710
- Document Type :
- article
- Full Text :
- https://doi.org/10.11896/jsjkx.211200018