Improved GPT2 Event Extraction Method Based on Mixed Attention Collaborative Layer Vector
- Authors
Ruchao Jia, Zhenling Zhang, Yangli Jia, Maria Papadopoulou, and Christophe Roche
- Subjects
Transformer, GPT2, mixed attention, layer vector, event extraction, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
As internet information expands rapidly, extracting valuable event information from unstructured text has become an important research topic. This paper proposes an improved GPT2 model, termed HACLV-GPT2, which is the first application of a GPT-like architecture to event extraction. The model uses a generative input template and incorporates a hybrid attention mechanism to improve the understanding of complex contexts. In addition, HACLV-GPT2 employs a layer-vector fusion strategy to optimize the output of the Transformer blocks, effectively boosting prediction performance. Experimental results show that HACLV-GPT2 performs well on both event argument extraction and event type detection, with F1 scores of 0.8020 and 0.9614, respectively, surpassing several baseline models. These results validate the effectiveness of the proposed method, and ablation experiments confirm the critical role of the hybrid attention mechanism and the layer-vector fusion strategy in the performance improvement.
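The abstract gives no implementation details, but the layer-vector fusion idea (combining the hidden states of all Transformer blocks instead of using only the final layer) can be sketched as below. This is a minimal illustration assuming a Hugging Face GPT-2 backbone; the class names (`LayerVectorFusion`, `FusedGPT2`) and the softmax-weighted fusion are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of layer-vector fusion over a GPT-2 backbone.
# Assumption: fusion is a learned softmax-weighted sum of per-layer
# hidden states; the paper's exact strategy may differ.
import torch
import torch.nn as nn
from transformers import GPT2Model

class LayerVectorFusion(nn.Module):
    """Learn one scalar weight per layer, normalize with softmax,
    and return the weighted sum of all layers' hidden states."""
    def __init__(self, num_layers: int):
        super().__init__()
        self.weights = nn.Parameter(torch.zeros(num_layers))

    def forward(self, hidden_states):
        # hidden_states: tuple of (batch, seq, dim) tensors, one per layer
        stacked = torch.stack(hidden_states, dim=0)        # (L, batch, seq, dim)
        w = torch.softmax(self.weights, dim=0)             # normalized layer weights
        return (w.view(-1, 1, 1, 1) * stacked).sum(dim=0)  # (batch, seq, dim)

class FusedGPT2(nn.Module):
    """GPT-2 with per-token classification over fused layer vectors."""
    def __init__(self, num_labels: int):
        super().__init__()
        self.gpt2 = GPT2Model.from_pretrained("gpt2")
        # +1 because output_hidden_states also returns the embedding layer
        self.fusion = LayerVectorFusion(self.gpt2.config.n_layer + 1)
        self.classifier = nn.Linear(self.gpt2.config.n_embd, num_labels)

    def forward(self, input_ids, attention_mask=None):
        out = self.gpt2(input_ids, attention_mask=attention_mask,
                        output_hidden_states=True)
        fused = self.fusion(out.hidden_states)  # layer-vector fusion
        return self.classifier(fused)           # per-token event labels
```

Fusing all layers lets the classifier draw on both shallow lexical features and deep contextual features, which is a plausible reason the abstract reports gains over last-layer-only baselines; the hybrid attention component is not specified in enough detail here to sketch.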
- Published
2024