Understanding Gating Operations in Recurrent Neural Networks through Opinion Expression Extraction.
- Source: Entropy. Aug 2016, Vol. 18, Issue 8, p294. 21p.
- Publication Year: 2016
Abstract
- Extracting opinion expressions from text is an essential task in sentiment analysis and is usually treated as a word-level sequence labeling problem. In such problems, compositional models with multiplicative gating operations provide efficient ways to encode context and to select critical information. In this paper, we adopt Long Short-Term Memory (LSTM) recurrent neural networks to address the task of opinion expression extraction and explore the internal mechanisms of the model. The proposed approach is evaluated on the Multi-Perspective Question Answering (MPQA) opinion corpus. The experimental results demonstrate improvement over previous approaches, including the state-of-the-art method based on simple recurrent neural networks. We also provide a novel micro-level perspective for analyzing the run-time processes, gaining new insights into how the LSTM selects its sources of information through flexible connections and multiplicative gating operations. [ABSTRACT FROM AUTHOR]
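To make the "multiplicative gating operations" mentioned in the abstract concrete, here is a minimal NumPy sketch of a single LSTM cell step; the function and variable names are illustrative assumptions, not taken from the paper. The forget and input gates multiplicatively control how much of the previous memory is kept and how much new candidate content is written, and the output gate controls how much of the memory is exposed to the per-token tagger.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step with multiplicative gating.

    W: (4*d, input_dim), U: (4*d, d), b: (4*d,) hold the parameters for the
    input, forget, output, and candidate blocks stacked along the first axis.
    """
    d = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b        # pre-activations for all four blocks
    i = sigmoid(z[0:d])                 # input gate: how much new content to write
    f = sigmoid(z[d:2*d])               # forget gate: how much old memory to keep
    o = sigmoid(z[2*d:3*d])             # output gate: how much memory to expose
    g = np.tanh(z[3*d:4*d])             # candidate cell content
    c_t = f * c_prev + i * g            # multiplicative gating of the memory cell
    h_t = o * np.tanh(c_t)              # gated hidden state fed to the tagger
    return h_t, c_t

# Toy usage: run the cell over a short sequence of 5-dimensional token embeddings.
rng = np.random.default_rng(0)
input_dim, d, seq_len = 5, 8, 4
W = rng.normal(scale=0.1, size=(4 * d, input_dim))
U = rng.normal(scale=0.1, size=(4 * d, d))
b = np.zeros(4 * d)
h, c = np.zeros(d), np.zeros(d)
for x_t in rng.normal(size=(seq_len, input_dim)):
    h, c = lstm_step(x_t, h, c, W, U, b)
print(h.shape)  # (8,) -- per-token hidden state used for sequence labeling
```

In a word-level sequence labeling setup such as opinion expression extraction, each token's hidden state h_t would be mapped to a tag (e.g., inside/outside an opinion expression); the gate activations i, f, and o are the run-time quantities one would inspect to analyze how the model selects its source of information.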
Details
- Language: English
- ISSN: 1099-4300
- Volume: 18
- Issue: 8
- Database: Academic Search Index
- Journal: Entropy
- Publication Type: Academic Journal
- Accession Number: 117687144
- Full Text: https://doi.org/10.3390/e18080294