A Primer in BERTology: What we know about how BERT works
- Publication Year: 2020
Abstract
- Transformer-based models have pushed the state of the art in many areas of NLP, but our understanding of what is behind their success is still limited. This paper is the first survey of over 150 studies of the popular BERT model. We review the current state of knowledge about how BERT works, what kind of information it learns and how it is represented, common modifications to its training objectives and architecture, the overparameterization issue, and approaches to compression. We then outline directions for future research.
- Comment: Accepted to TACL. Please note that the multilingual BERT section is only available in version 1.
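
As a pointer to the kind of analysis the surveyed BERTology studies perform, the sketch below extracts BERT's per-layer self-attention weights with the HuggingFace `transformers` library; the checkpoint name and example sentence are illustrative assumptions, not part of this record or the paper itself.

```python
# Minimal sketch of a common BERTology probe: inspecting self-attention
# weights layer by layer. Assumes the HuggingFace `transformers` library;
# the checkpoint and input sentence are illustrative choices.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer (12 for bert-base),
# each shaped (batch, num_heads, seq_len, seq_len).
for layer, attn in enumerate(outputs.attentions):
    print(f"layer {layer}: heads={attn.shape[1]}, seq_len={attn.shape[-1]}")
```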
- Subjects: Computer Science - Computation and Language
Details
- Database: arXiv
- Publication Type: Report
- Accession number: edsarx.2002.12327
- Document Type: Working Paper