Temporal-kernel recurrent neural networks.
- Source :
- Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2010 Mar; Vol. 23 (2), pp. 239-43. Date of Electronic Publication: 2009 Nov 05.
- Publication Year :
- 2010
Abstract
- A Recurrent Neural Network (RNN) is a powerful connectionist model that can be applied to many challenging sequential problems, including problems that naturally arise in language and speech. However, RNNs are extremely hard to train on problems that have long-term dependencies, where it is necessary to remember events for many timesteps before using them to make a prediction. In this paper we consider the problem of training RNNs to predict sequences that exhibit significant long-term dependencies, focusing on a serial recall task where the RNN needs to remember a sequence of characters for a large number of steps before reconstructing it. We introduce the Temporal-Kernel Recurrent Neural Network (TKRNN), which is a variant of the RNN that can cope with long-term dependencies much more easily than a standard RNN, and show that the TKRNN develops short-term memory that successfully solves the serial recall task by representing the input string with a stable state of its hidden units. (Copyright 2009 Elsevier Ltd. All rights reserved.)
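- The abstract does not give the kernel's form, but one common way to realize a "temporal kernel" on a recurrent connection is an exponentially decaying trace of past activity, which lets a unit at time t see contributions from all earlier timesteps without backpropagating through each one. The sketch below illustrates only that general idea; the names (`kernel_trace`, `lam`) and the specific exponential kernel are illustrative assumptions, not the paper's exact formulation.

```python
def kernel_trace(xs, lam):
    """Exponentially weighted trace of an input sequence.

    Computes z_t = lam * z_{t-1} + x_t, which equals the explicit
    kernel sum  z_t = sum_{s<=t} lam**(t-s) * x_s.  `lam` in (0, 1)
    controls how slowly old inputs decay (illustrative parameter).
    """
    z, out = 0.0, []
    for x in xs:
        z = lam * z + x  # fold the new input into the decaying trace
        out.append(z)
    return out

# The O(1)-per-step recurrence matches the explicit kernel sum:
xs, lam = [1.0, 0.0, 0.0, 2.0], 0.5
explicit = [sum(lam ** (t - s) * xs[s] for s in range(t + 1))
            for t in range(len(xs))]
assert kernel_trace(xs, lam) == explicit  # both give [1.0, 0.5, 0.25, 2.125]
```

Because the trace is a fixed linear filter, an input seen many steps ago still influences the current state (scaled by lam**delay), which is the kind of mechanism that eases learning of long-range dependencies relative to a vanilla RNN.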
Details
- Language :
- English
- ISSN :
- 1879-2782
- Volume :
- 23
- Issue :
- 2
- Database :
- MEDLINE
- Journal :
- Neural networks : the official journal of the International Neural Network Society
- Publication Type :
- Academic Journal
- Accession number :
- 19932002
- Full Text :
- https://doi.org/10.1016/j.neunet.2009.10.009