Self-attention-based conditional random fields latent variables model for sequence labeling
- Source: Pattern Recognition Letters
- Publication Year: 2021
- Publisher: Elsevier BV
Abstract
- Natural Language Processing (NLP) is a valuable tool for processing data such as text and speech. As one of NLP's upstream tasks, sequence labeling is a vital part of NLP, underpinning techniques such as text classification, machine translation, and sentiment analysis. In this paper, our focus is on sequence labeling, where semantic labels are assigned to the elements of input sequences. We present two novel frameworks, SA-CRFLV-I and SA-CRFLV-II, that use latent variables within conditional random fields. These frameworks employ an encoding schema in the form of a latent variable to capture the latent structure in the observed data. SA-CRFLV-I shows the best performance at the sentence level, whereas SA-CRFLV-II works best at the word level. In in-depth experiments, we compare our frameworks with well-known sequence prediction methods on 4 tasks: NER, reference parsing, chunking, and POS tagging. The proposed frameworks are shown to have better performance in terms of many well-known metrics.
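The abstract builds on conditional random fields for sequence labeling. As a minimal illustrative sketch (standard linear-chain CRF Viterbi decoding, not the paper's SA-CRFLV models), the snippet below finds the highest-scoring label sequence given hypothetical per-position `emissions` and label-to-label `transitions` score arrays:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Best label sequence for a linear-chain CRF.

    emissions:   (T, L) array of per-position label scores
    transitions: (L, L) array; transitions[i, j] scores label i -> label j
    Returns the highest-scoring sequence as a list of label indices.
    """
    T, L = emissions.shape
    score = emissions[0].copy()            # best score ending in each label
    backptr = np.zeros((T, L), dtype=int)  # back-pointers for path recovery
    for t in range(1, T):
        # score of every (previous, current) label pair at step t
        cand = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # follow back-pointers from the best final label
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

# With zero transitions, decoding reduces to per-position argmax;
# "sticky" transitions instead reward keeping the same label.
emissions = np.array([[2.0, 0.0], [0.0, 2.0], [2.0, 0.0]])
print(viterbi_decode(emissions, np.zeros((2, 2))))           # [0, 1, 0]
sticky = np.array([[3.0, -3.0], [-3.0, 3.0]])
print(viterbi_decode(emissions, sticky))                      # [0, 0, 0]
```

The two calls show how the transition term lets the CRF trade per-position evidence against label-sequence coherence, which is the structured-prediction aspect the latent-variable frameworks extend.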
- Subjects :
- Conditional random field
- Parsing
- Machine translation
- Computer science
- Sentiment analysis
- Latent variable
- Sequence labeling
- Artificial Intelligence
- Signal Processing
- Chunking (psychology)
- Computer Vision and Pattern Recognition
- Natural language processing
- Sentence
- Software
Details
- ISSN: 0167-8655
- Volume: 145
- Database: OpenAIRE
- Journal: Pattern Recognition Letters
- Accession number: edsair.doi.dedup.....60f8c96e060dce5523e71d49b879da56