
RNNs can generate bounded hierarchical languages with optimal memory

Authors :
John Hewitt
Michael Hahn
Surya Ganguli
Percy Liang
Christopher D. Manning
Source :
EMNLP (1), Scopus-Elsevier
Publication Year :
2020
Publisher :
Association for Computational Linguistics, 2020.

Abstract

Recurrent neural networks empirically generate natural language with high syntactic fidelity. However, their success is not well-understood theoretically. We provide theoretical insight into this success, proving in a finite-precision setting that RNNs can efficiently generate bounded hierarchical languages that reflect the scaffolding of natural language syntax. We introduce Dyck-($k$,$m$), the language of well-nested brackets (of $k$ types) and $m$-bounded nesting depth, reflecting the bounded memory needs and long-distance dependencies of natural language syntax. The best known results use $O(k^{\frac{m}{2}})$ memory (hidden units) to generate these languages. We prove that an RNN with $O(m \log k)$ hidden units suffices, an exponential reduction in memory, by an explicit construction. Finally, we show that no algorithm, even with unbounded computation, can suffice with $o(m \log k)$ hidden units.

Comment: EMNLP 2020 + appendix typo fixes
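As a concrete illustration of the Dyck-($k$,$m$) definition in the abstract, the following is a minimal Python sketch (not taken from the paper, and not the RNN construction itself) of a membership check using a stack whose depth is capped at $m$. The tuple encoding of brackets and the example values $k=16$, $m=8$ in the memory comparison are illustrative assumptions.

from math import log2

def is_dyck_km(tokens, k, m):
    """Check membership in Dyck-(k, m): well-nested brackets of k types
    with nesting depth bounded by m. Brackets are encoded as tuples:
    ('(', t) opens a bracket of type t, (')', t) closes it."""
    stack = []
    for kind, t in tokens:
        if kind == '(':
            stack.append(t)
            if len(stack) > m:               # nesting depth exceeds the bound m
                return False
        else:
            if not stack or stack[-1] != t:  # unmatched or wrongly typed close
                return False
            stack.pop()
    return not stack                          # every opened bracket must be closed

# Example word of depth 2 over k = 2 bracket types: ( [ ] ) with types 0 and 1
word = [('(', 0), ('(', 1), (')', 1), (')', 0)]
print(is_dyck_km(word, k=2, m=2))   # True
print(is_dyck_km(word, k=2, m=1))   # False: depth 2 exceeds m = 1

# Memory comparison from the abstract, with illustrative values k = 16, m = 8:
k, m = 16, 8
print(m * log2(k))    # O(m log k) hidden units: 32.0
print(k ** (m / 2))   # previous O(k^(m/2)) bound: 65536.0

The final two prints only make the abstract's exponential-reduction claim concrete for one parameter setting; the paper's lower bound shows that $o(m \log k)$ hidden units are insufficient regardless of the algorithm.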

Details

Database :
OpenAIRE
Journal :
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Accession number :
edsair.doi.dedup.....f5710b69420dcea64ac23f21dfc97de5