1. Incorporating Common Knowledge and Specific Entity Linking Knowledge for Machine Reading Comprehension
- Authors
- Yiwei Shan, Neng Gao, Xiaobo Guo, and Shoukang Han
- Subjects
- Coreference, Computer science, Comprehension, Entity linking, Knowledge base, Common knowledge, Graph (abstract data type), Artificial intelligence, Natural language processing
- Abstract
Machine comprehension of texts often requires external common knowledge and coreference resolution within the passage. However, most current machine reading comprehension models incorporate only external common knowledge. We propose the CoSp model, which incorporates both common knowledge and specific entity linking knowledge for machine reading comprehension. It employs an attention mechanism to adaptively select relevant commonsense and lexical common knowledge from knowledge bases, and then leverages the relational-GCN to reason over an entity graph constructed from entity coreference and co-occurrence in each passage. This yields knowledge-aware and coreference-aware contextual word representations for answer prediction. Experimental results indicate that the CoSp model offers significant and consistent improvements over BERT, outperforming competitive knowledge-aware models on the ReCoRD and SQuAD1.1 benchmarks.
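To illustrate the graph-reasoning step mentioned in the abstract, here is a minimal sketch of a single relational-GCN layer over an entity graph with two relation types (coreference and co-occurrence edges). It is an illustrative PyTorch reconstruction under our own assumptions, not the authors' released code; the class name `RelationalGCNLayer` and the toy adjacency matrices are hypothetical.

```python
import torch
import torch.nn as nn

class RelationalGCNLayer(nn.Module):
    """One relational-GCN layer: each node aggregates messages with a
    separate linear transform per relation type, plus a self-loop.
    Sketch only; the two relations mirror the paper's entity graph
    (coreference edges and co-occurrence edges)."""

    def __init__(self, dim, num_relations=2):
        super().__init__()
        # One weight matrix per relation type, plus a self-loop transform.
        self.rel_weights = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_relations)])
        self.self_loop = nn.Linear(dim, dim)

    def forward(self, node_feats, adjs):
        # node_feats: (num_nodes, dim)
        # adjs: list of (num_nodes, num_nodes) adjacency matrices,
        #       one per relation type (ideally row-normalized).
        out = self.self_loop(node_feats)
        for adj, lin in zip(adjs, self.rel_weights):
            out = out + adj @ lin(node_feats)  # aggregate neighbors per relation
        return torch.relu(out)

# Toy usage: 4 entity mentions with coreference and co-occurrence edges.
feats = torch.randn(4, 16)
coref = torch.tensor([[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]], dtype=torch.float)
cooc  = torch.tensor([[0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, 0]], dtype=torch.float)
layer = RelationalGCNLayer(dim=16, num_relations=2)
updated = layer(feats, [coref, cooc])  # coreference-aware entity representations
```

In the full model, the updated entity representations would be combined with the knowledge-enriched contextual word representations before answer prediction; how that fusion is done here is left open, as the abstract does not specify it.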
- Published
- 2021