Study on Chinese Semantic Entity Recognition Method for Cabin Utilizing BERT-BiGRU Model
- Author
Ruina Ma, Hui Cao, Zhihao Song, and Xiaoyu Wu
- Subjects
Entity recognition, BERT-BiGRU, CRF, deep learning, turbine engineering, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Named Entity Recognition (NER) aims to recognize entities in the engine room domain from unstructured engine room text. However, entities in this domain are diverse and complex, and nesting occurs among them, resulting in a low entity recognition rate. In this paper, a deep learning method incorporating a language model is proposed to enhance entity recognition performance in the engine room domain. Firstly, the Bidirectional Encoder Representations from Transformers (BERT) language model is employed for text feature extraction, producing a matrix of word-level vector representations. Secondly, the trained word vectors are fed into a Bidirectional Gated Recurrent Unit (BiGRU) to extract contextual semantic entity features. Finally, the globally optimal tag sequence is decoded with a Conditional Random Field (CRF) model to obtain the named entities in the ship cabin text. The experimental results show that the proposed algorithm achieves better F1 values for all three types of entity recognition. Compared with BERT-BiGRU, the overall accuracy, recall, and F1 value of entity recognition are improved by 1.35%, 1.45%, and 1.40%, respectively.
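
As a rough illustration of the BERT-BiGRU-CRF pipeline described in the abstract, the sketch below wires a pretrained Chinese BERT encoder into a bidirectional GRU and a CRF decoding layer. It is a minimal sketch under stated assumptions, not the authors' implementation: it assumes PyTorch, the Hugging Face transformers library, and the pytorch-crf package, and the model name, tag set, and hidden size are illustrative placeholders rather than values from the paper.

```python
# Minimal sketch of a BERT-BiGRU-CRF tagger (not the paper's code).
# Assumes torch, transformers, and pytorch-crf are installed.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast
from torchcrf import CRF  # provided by the pytorch-crf package


class BertBiGRUCRF(nn.Module):
    def __init__(self, num_tags: int, bert_name: str = "bert-base-chinese",
                 gru_hidden: int = 256):
        super().__init__()
        # Step 1: BERT produces contextual vector representations of the text.
        self.bert = BertModel.from_pretrained(bert_name)
        # Step 2: BiGRU extracts contextual semantic entity features.
        self.bigru = nn.GRU(self.bert.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)
        # Emission scores over the tag set (e.g. BIO tags for cabin entities).
        self.to_tags = nn.Linear(2 * gru_hidden, num_tags)
        # Step 3: CRF decodes the globally optimal tag sequence.
        self.crf = CRF(num_tags, batch_first=True)

    def _emissions(self, input_ids, attention_mask):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        gru_out, _ = self.bigru(hidden)
        return self.to_tags(gru_out)

    def loss(self, input_ids, attention_mask, tags):
        emissions = self._emissions(input_ids, attention_mask)
        # Negative log-likelihood of the gold tag sequence under the CRF.
        return -self.crf(emissions, tags, mask=attention_mask.bool(), reduction="mean")

    def decode(self, input_ids, attention_mask):
        emissions = self._emissions(input_ids, attention_mask)
        # Viterbi decoding returns the best tag index sequence per sentence.
        return self.crf.decode(emissions, mask=attention_mask.bool())


if __name__ == "__main__":
    # Hypothetical BIO tag set for three entity types (names are illustrative).
    tags = ["O", "B-EQP", "I-EQP", "B-SYS", "I-SYS", "B-PAR", "I-PAR"]
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
    model = BertBiGRUCRF(num_tags=len(tags))
    batch = tokenizer(["主机滑油压力过低"], return_tensors="pt")
    print(model.decode(batch["input_ids"], batch["attention_mask"]))
```

The decoded output is a list of tag indices per sentence, which would be mapped back onto the hypothetical `tags` list to recover the labelled entity spans.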
- Published
2024