
Modular Growth of Hierarchical Networks: Efficient, General, and Robust Curriculum Learning

Authors:
Hamidi, Mani
Khajehabdollahi, Sina
Giannakakis, Emmanouil
Schäfer, Tim
Levina, Anna
Wu, Charley M.
Publication Year:
2024

Abstract

Structural modularity is a pervasive feature of biological neural networks and has been linked to several functional and computational advantages. Yet, despite early successes, the use of modular architectures in artificial neural networks has remained relatively limited. Here, we explore the performance and functional dynamics of a modular network trained on a memory task via an iterative growth curriculum. We find that, for a given classical, non-modular recurrent neural network (RNN), an equivalent modular network performs better across multiple metrics, including training time, generalizability, and robustness to some perturbations. We further examine how different aspects of a modular network's connectivity contribute to its computational capability. We then demonstrate that the inductive bias introduced by the modular topology is strong enough for the network to perform well even when the connectivity within modules is fixed and only the connections between modules are trained. Our findings suggest that gradual modular growth of RNNs could provide advantages for learning increasingly complex tasks on evolutionary timescales, and help build more scalable and compressible artificial networks.
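To make the architectural idea in the abstract concrete, the sketch below shows one way a modular RNN with frozen intra-module weights and trainable inter-module weights could be set up. It is a minimal illustration, not the authors' implementation: the number of modules, module sizes, masking scheme, and initialization scales are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's code) of a modular RNN where
# recurrent weights within each module are fixed and only inter-module
# connections are optimized.
import torch
import torch.nn as nn


class ModularRNN(nn.Module):
    def __init__(self, n_modules=4, module_size=32, input_size=8, output_size=2):
        super().__init__()
        hidden = n_modules * module_size
        self.w_in = nn.Linear(input_size, hidden)
        self.w_out = nn.Linear(hidden, output_size)

        # Block-diagonal mask: 1 within modules, 0 between modules.
        blocks = [torch.ones(module_size, module_size) for _ in range(n_modules)]
        intra_mask = torch.block_diag(*blocks)
        self.register_buffer("inter_mask", 1.0 - intra_mask)

        # Fixed (random) intra-module weights stored as a buffer, so they
        # receive no gradients; inter-module weights are trainable parameters.
        self.register_buffer("w_intra", torch.randn(hidden, hidden) * intra_mask * 0.1)
        self.w_inter = nn.Parameter(torch.randn(hidden, hidden) * 0.01)

    def forward(self, x):
        # x: (batch, time, input_size)
        batch, T, _ = x.shape
        h = torch.zeros(batch, self.w_intra.shape[0], device=x.device)
        outputs = []
        for t in range(T):
            # Combine frozen within-module weights with trainable
            # between-module weights (masked to off-diagonal blocks).
            w_rec = self.w_intra + self.w_inter * self.inter_mask
            h = torch.tanh(self.w_in(x[:, t]) + h @ w_rec.T)
            outputs.append(self.w_out(h))
        return torch.stack(outputs, dim=1)
```

In this setup, only `w_in`, `w_out`, and the masked inter-module matrix are updated during training, which mirrors the abstract's point that the modular topology alone can carry much of the inductive bias.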

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.06262
Document Type:
Working Paper