Echo state Gaussian process.
- Source : IEEE transactions on neural networks [IEEE Trans Neural Netw] 2011 Sep; Vol. 22 (9), pp. 1435-45. Date of Electronic Publication: 2011 Jul 29.
- Publication Year : 2011
Abstract
- Echo state networks (ESNs) constitute a novel approach to recurrent neural network (RNN) training, with an RNN (the reservoir) being generated randomly, and only a readout being trained using a simple, computationally efficient algorithm. ESNs have greatly facilitated the practical application of RNNs, outperforming classical approaches on a number of benchmark tasks. In this paper, we introduce a novel Bayesian approach toward ESNs, the echo state Gaussian process (ESGP). The ESGP combines the merits of ESNs and Gaussian processes to provide a more robust alternative to conventional reservoir computing networks while also offering a measure of confidence in the generated predictions (in the form of a predictive distribution). We exhibit the merits of our approach in a number of applications, considering both benchmark datasets and real-world applications, where we show that our method offers a significant enhancement in the dynamical data modeling capabilities of ESNs. Additionally, we show that our method is orders of magnitude more computationally efficient than existing Gaussian process-based methods for dynamical data modeling, without compromising predictive performance.
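The abstract describes the ESGP as a randomly generated ESN reservoir whose usual linear readout is replaced by a Gaussian process, so that predictions come with a predictive distribution rather than a point estimate. The sketch below is only an illustration of that idea, not the paper's exact formulation: the reservoir size, spectral radius, RBF kernel, noise level, and the toy sine-wave task are all assumptions made for this example.

```python
# Illustrative ESGP-style sketch (assumed setup, not the authors' exact method):
# a fixed random reservoir is driven by the input sequence, and a GP regression
# over the reservoir states serves as the readout, giving mean and variance.
import numpy as np

rng = np.random.default_rng(0)

# --- Random reservoir (echo state network part) ---
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 (assumed value)

def run_reservoir(u):
    """Collect reservoir states for a 1-D input sequence u of shape [T]."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# --- Toy one-step-ahead prediction task on a noisy sine wave (illustrative data) ---
t = np.arange(400)
series = np.sin(0.1 * t) + 0.05 * rng.standard_normal(len(t))
X = run_reservoir(series[:-1])          # reservoir states as GP inputs
y = series[1:]                          # next value as GP target
X_train, y_train, X_test, y_test = X[:300], y[:300], X[300:], y[300:]

# --- GP readout with an RBF kernel in place of the usual linear readout ---
def rbf(A, B, length=3.0, var=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

noise = 1e-3
K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
K_s = rbf(X_test, X_train)

mean = K_s @ np.linalg.solve(K, y_train)                 # predictive mean
cov = rbf(X_test, X_test) - K_s @ np.linalg.solve(K, K_s.T)
std = np.sqrt(np.clip(np.diag(cov), 0, None))            # predictive uncertainty

print("test RMSE:", np.sqrt(np.mean((mean - y_test) ** 2)))
print("mean predictive std:", std.mean())
```

In the paper's setting the kernel hyperparameters would be learned (e.g., by marginal-likelihood maximization) rather than fixed as in this sketch, and the computational-efficiency claims of the abstract refer to the authors' method, not to this naive GP solve.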
Details
- Language : English
- ISSN : 1941-0093
- Volume : 22
- Issue : 9
- Database : MEDLINE
- Journal : IEEE transactions on neural networks
- Publication Type : Academic Journal
- Accession number : 21803684
- Full Text : https://doi.org/10.1109/TNN.2011.2162109