Residual Echo State Networks: Residual recurrent neural networks with stable dynamics and fast learning.
- Source :
- Neurocomputing, Sep 2024, Vol. 597
- Publication Year :
- 2024
Abstract
- Residual connections have been established as a staple of modern deep learning architectures. Most of their applications are cast towards feedforward computing. In this paper, we study the architectural bias of residual connections in the context of recurrent neural networks (RNNs), specifically in the temporal dimension. We frame our discussion from the perspective of Reservoir Computing and dynamical systems theory, focusing on important aspects of neural computation such as memory capacity, long-term information processing, stability, and nonlinear computation capability. Experiments corroborate the striking advantage brought by temporal residual connections on a wide range of time series processing tasks, including memory-based, forecasting, and classification problems.
• We study residual (skip) connections in the context of RNNs in the temporal dimension.
• Temporal residual connections enable long-term processing of time series.
• Orthogonal skip connections make it possible to drive RNNs to the edge of stability in an approximate dynamical isometry regime.
• Experiments on memory, forecasting, and classification tasks show the benefits of temporal residual connections in RNNs. [ABSTRACT FROM AUTHOR]
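The abstract names the architecture but gives no equations. Below is a minimal NumPy sketch of one plausible reading of a temporal residual connection in a reservoir: a state update of the form h_t = α·O·h_{t−1} + β·tanh(W·h_{t−1} + W_in·u_t), where O is an orthogonal skip matrix. The class name ResidualESN and the scaling parameters alpha and beta are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def random_orthogonal(n, rng):
    # QR decomposition of a Gaussian matrix yields a random orthogonal matrix;
    # fixing the signs of R's diagonal makes the distribution uniform (Haar).
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

class ResidualESN:
    """Hypothetical sketch of a reservoir with a temporal (orthogonal) skip branch.
    Not the paper's exact formulation; parameter names are assumptions."""

    def __init__(self, n_in, n_res, spectral_radius=0.9, alpha=1.0, beta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
        W = rng.standard_normal((n_res, n_res))
        # Rescale the recurrent weights to a target spectral radius, as is
        # standard practice in Echo State Networks.
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.O = random_orthogonal(n_res, rng)  # orthogonal skip connection
        self.alpha, self.beta = alpha, beta

    def run(self, U):
        # U: array of shape (T, n_in); returns reservoir states of shape (T, n_res).
        h = np.zeros(self.W.shape[0])
        states = []
        for u in U:
            # Temporal residual update: orthogonal skip of the previous state
            # plus a scaled nonlinear branch.
            h = self.alpha * self.O @ h + self.beta * np.tanh(self.W @ h + self.W_in @ u)
            states.append(h.copy())
        return np.array(states)

# Example: drive the reservoir with a random input sequence.
rng = np.random.default_rng(1)
esn = ResidualESN(n_in=1, n_res=100)
states = esn.run(rng.uniform(-1.0, 1.0, (200, 1)))
print(states.shape)  # (200, 100)
```

In the Reservoir Computing setting the recurrent weights above stay fixed and only a linear readout (e.g. ridge regression from the collected states to the targets) is trained, which is what makes learning fast.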
Details
- Language :
- English
- ISSN :
- 0925-2312
- Volume :
- 597
- Database :
- Academic Search Index
- Journal :
- Neurocomputing
- Publication Type :
- Academic Journal
- Accession number :
- 178642995
- Full Text :
- https://doi.org/10.1016/j.neucom.2024.127966