Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)
- Source :
- AAAI
- Publication Year :
- 2020
- Publisher :
- Association for the Advancement of Artificial Intelligence (AAAI), 2020.
Abstract
- Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We incorporate contextual information from a pre-trained language model to make better use of annotation and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
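- The abstract gives no implementation details, so the following is only a minimal sketch of the general idea it describes: using a pre-trained BERT masked language model to propose corrections with little or no task-specific annotation, by masking each token and flagging positions where the model strongly prefers a different word. The model name, probability threshold, and helper function are illustrative assumptions, not the authors' method.

```python
# A minimal sketch (not the paper's implementation) of minimal-supervision
# GEC with a pre-trained BERT masked language model: mask each token and,
# when BERT assigns high probability to a different token at that position,
# suggest it as a correction. Threshold of 0.9 is an illustrative assumption.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

def propose_corrections(sentence: str, threshold: float = 0.9):
    """Return (position, original_token, suggested_token) candidates."""
    input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    suggestions = []
    # Skip the [CLS] (first) and [SEP] (last) special tokens.
    for i in range(1, input_ids.size(0) - 1):
        masked = input_ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        probs = torch.softmax(logits, dim=-1)
        top_id = int(torch.argmax(probs))
        # Flag a token when BERT strongly prefers a different word here.
        if top_id != int(input_ids[i]) and probs[top_id] > threshold:
            suggestions.append(
                (i,
                 tokenizer.convert_ids_to_tokens(int(input_ids[i])),
                 tokenizer.convert_ids_to_tokens(top_id))
            )
    return suggestions

print(propose_corrections("He go to school every day."))
```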
- Subjects :
- Grammar
Computer science
Minimal supervision
Language model
Artificial intelligence
Error detection and correction
Encoder
Natural language processing
Transformer (machine learning model)
Details
- ISSN :
- 2374-3468 and 2159-5399
- Volume :
- 34
- Database :
- OpenAIRE
- Journal :
- Proceedings of the AAAI Conference on Artificial Intelligence
- Accession number :
- edsair.doi...........4af1fa6fdb60c5b36c1d83f879708f23