1. Event Extraction Based on a Dynamic Masked Attention Mechanism.
- Author
- 黄细凤
- Subjects
- *TASKS, *MACHINE learning, *CORPORA, *ARGUMENT, *CLASSIFICATION
- Abstract
Event extraction is an important and challenging task in NLP: identifying event triggers and their arguments in text. For sentences that contain multiple events, this paper proposes a model based on a variant of the attention mechanism, called DyMAN, which captures richer context representations and retains more valuable information than standard attention. Experiments demonstrate that the proposed model achieves state-of-the-art performance on the ACE 2005 corpus. Compared with the previous best model, JRNN, DyMAN achieves a 9.8% improvement on the trigger classification task and a 4.5% improvement on the argument classification task. [ABSTRACT FROM AUTHOR]
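The record does not describe how DyMAN's masking works, but the general idea of a dynamically masked attention can be sketched as follows: per query, positions with low relevance scores are masked out before the softmax, so attention weight is concentrated on the most informative context. Everything here (function names, the top-k masking rule, the `keep_ratio` parameter) is an illustrative assumption, not the paper's actual mechanism.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_masked_attention(query, keys, values, keep_ratio=0.5):
    """Illustrative masked attention: for this query, keep only the
    top-scoring fraction (keep_ratio) of key positions and mask the
    rest with -inf before the softmax, so they receive zero weight.
    This is a hypothetical sketch, not the paper's DyMAN definition."""
    scores = keys @ query                         # (seq_len,) relevance scores
    k = max(1, int(np.ceil(keep_ratio * len(scores))))
    threshold = np.sort(scores)[-k]               # k-th largest score
    masked = np.where(scores >= threshold, scores, -np.inf)
    weights = softmax(masked)                     # masked positions -> weight 0
    return weights @ values, weights

rng = np.random.default_rng(0)
keys = rng.standard_normal((6, 4))
values = rng.standard_normal((6, 4))
query = rng.standard_normal(4)
context, attn = dynamic_masked_attention(query, keys, values, keep_ratio=0.5)
```

Here `context` is the masked attention readout over 6 positions, of which only 3 contribute non-zero weight; a full model would apply this per token when classifying triggers and arguments.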
- Published
- 2020