1. Shallow parsing with long short-term memory
- Author
-
Hammerton, J
- Abstract
Applying Artificial Neural Networks (ANNs) to language learning has been an active area of research in connectionism. However, much of this work has involved small and/or artificially created data sets, whilst other approaches to language learning are now routinely applied to large real-world datasets containing hundreds of thousands of words or more, raising the question of how ANNs might scale up. This paper describes recent work on shallow parsing of real-world texts using a recurrent neural network (RNN) architecture called Long Short-Term Memory (LSTM) (1).
- Published
- 2003
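The Long Short-Term Memory architecture named in the abstract can be summarized as a gated recurrence in which a separate cell state carries information across time steps. The sketch below is purely illustrative (it is not the paper's implementation; the weight shapes, random initialization, and gate ordering are assumptions of this toy): it runs one standard LSTM cell, with a forget gate, over a short random input sequence.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One standard LSTM time step (forget-gate formulation).

    x: input vector (d,); h_prev, c_prev: previous hidden/cell state (n,).
    W: (4n, d) input weights, U: (4n, n) recurrent weights, b: (4n,) bias,
    with gates stacked in the order input, forget, output, candidate
    (an assumed convention for this sketch).
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:n])        # input gate: how much new content to write
    f = sigmoid(z[n:2*n])      # forget gate: how much old memory to keep
    o = sigmoid(z[2*n:3*n])    # output gate: how much memory to expose
    g = np.tanh(z[3*n:4*n])    # candidate cell update
    c = f * c_prev + i * g     # cell state: the long-term "memory" path
    h = o * np.tanh(c)         # hidden state emitted at this step
    return h, c

# Toy run: a sequence of 10 random 5-dimensional inputs, hidden size 8.
rng = np.random.default_rng(0)
d, n = 5, 8
W = rng.normal(scale=0.1, size=(4 * n, d))
U = rng.normal(scale=0.1, size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for x in rng.normal(size=(10, d)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (8,)
```

For shallow parsing as described in the abstract, such a cell would be applied to a word-by-word input representation, with the hidden state at each step feeding a classifier over chunk tags; those details are specific to the paper and not reproduced here.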