
On the Curse of Memory in Recurrent Neural Networks: Approximation and Optimization Analysis

Authors :
Li, Zhong
Han, Jiequn
E, Weinan
Li, Qianxiao
Publication Year :
2020

Abstract

We study the approximation properties and optimization dynamics of recurrent neural networks (RNNs) when applied to learn input-output relationships in temporal data. We consider the simple but representative setting of using continuous-time linear RNNs to learn from data generated by linear relationships. Mathematically, the latter can be understood as a sequence of linear functionals. We prove a universal approximation theorem for such linear functionals and characterize the approximation rate and its relation with memory. Moreover, we perform a fine-grained dynamical analysis of training linear RNNs, which further reveals the intricate interactions between memory and learning. A unifying theme uncovered is the non-trivial effect of memory, a notion that can be made precise in our framework, on approximation and optimization: when there is long-term memory in the target, it takes a large number of neurons to approximate it, and the training process suffers from slowdowns. In particular, both of these effects become exponentially more pronounced with memory - a phenomenon we call the "curse of memory". These analyses represent a basic step towards a concrete mathematical understanding of new phenomena that may arise in learning temporal relationships using recurrent architectures.

Comment: Updated to include the condition $\sup_n \| \boldsymbol{x}(n) \|_{\mathcal{X}} \leq 1$ in the definition of regularity, which excludes the trivial case where only the zero functional is regular. Fixed various typos and improved clarity.
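To make the setting concrete, the following is a minimal sketch of a continuous-time linear RNN and the linear-functional target it approximates; the notation ($W$, $U$, $\boldsymbol{c}$, $\boldsymbol{\rho}$) is chosen here for illustration and may differ from the paper's.

\begin{align}
  \frac{d\boldsymbol{h}(t)}{dt} &= W \boldsymbol{h}(t) + U \boldsymbol{x}(t), \qquad \boldsymbol{h}(-\infty) = 0, \\
  \hat{y}(t) &= \boldsymbol{c}^\top \boldsymbol{h}(t)
              = \int_0^\infty \boldsymbol{c}^\top e^{W s} U\, \boldsymbol{x}(t-s)\, ds .
\end{align}

Under this sketch, the RNN output is itself a linear functional of the input history with kernel $\boldsymbol{c}^\top e^{W s} U$, while a generic linear target can be written as $y(t) = \int_0^\infty \boldsymbol{\rho}(s)^\top \boldsymbol{x}(t-s)\, ds$; slow decay of $\boldsymbol{\rho}$ is the informal sense in which the target has long-term memory, which is the regime where the approximation and optimization difficulties described above arise.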

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2009.07799
Document Type :
Working Paper