
Expressive power of first-order recurrent neural networks determined by their attractor dynamics

Authors :
Alessandro E. P. Villa
Jérémie Cabessa
Source :
Journal of Computer and System Sciences. 82:1232-1250
Publication Year :
2016
Publisher :
Elsevier BV, 2016.

Abstract

We characterize the attractor-based expressive power of several models of recurrent neural networks. The deterministic rational-weighted networks are equivalent to Muller Turing machines. The deterministic real-weighted and evolving networks recognize the class of BC(Π⁰₂) neural ω-languages. The nondeterministic rational-weighted and real-weighted networks recognize the class of Σ¹₁ neural ω-languages.

We provide a characterization of the expressive power of several models of deterministic and nondeterministic first-order recurrent neural networks according to their attractor dynamics. The expressive power of a neural network is expressed as the topological complexity of its underlying neural ω-language, and reflects the ability of the network to perform more or less complicated classification tasks via the manifestation of specific attractor dynamics. In this context, we prove that most of the neural models under consideration are strictly more powerful than Muller Turing machines. These results provide new insights into the computational capabilities of recurrent neural networks.
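
For readers less familiar with the model, the sketch below illustrates the kind of first-order (sigmoidal) recurrent dynamics and attractor-based acceptance condition that this line of work builds on. It is a minimal illustration assuming the standard saturated-linear sigmoid model; the symbols x, u, a, b, c, σ, 𝒯 and the map inf(·) are illustrative notation rather than the paper's exact formalization.

% First-order recurrent dynamics (illustrative, assumed notation): each cell updates by
% applying a saturated-linear sigmoid to an affine combination of the current activations
% x_j(t) and the Boolean input bits u_j(t).
\[
x_i(t+1) = \sigma\!\left( \sum_{j=1}^{N} a_{ij}\,x_j(t) + \sum_{j=1}^{M} b_{ij}\,u_j(t) + c_i \right),
\qquad
\sigma(y) = \begin{cases} 0 & y < 0,\\ y & 0 \le y \le 1,\\ 1 & y > 1. \end{cases}
\]
% Attractor-based (Muller-style) acceptance: an infinite input stream s is accepted when
% the set of states visited infinitely often along the induced computation, written
% inf(N, s), belongs to a prescribed family T of accepting sets; the neural ω-language of
% the network N is the set of all accepted streams.
\[
L(\mathcal{N}) = \left\{\, s \in \left(\{0,1\}^{M}\right)^{\omega} \;:\; \mathrm{inf}(\mathcal{N}, s) \in \mathcal{T} \,\right\}.
\]

In the rational-weighted case the weights a_{ij}, b_{ij}, c_i range over the rationals, in the real-weighted case over the reals, and in the evolving case they may change over time; the results summarized above locate the corresponding classes of neural ω-languages within the Borel and analytical hierarchies (BC(Π⁰₂) and Σ¹₁, respectively).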

Details

ISSN :
0022-0000
Volume :
82
Database :
OpenAIRE
Journal :
Journal of Computer and System Sciences
Accession number :
edsair.doi...........a046a3eb58d64728d21b8505bbb3d04f