Emergent Structures and Lifetime Structure Evolution in Artificial Neural Networks

Authors:
Golkar, Siavash
Publication Year:
2019

Abstract

Motivated by the flexibility of biological neural networks, whose connectivity structure changes significantly during their lifetime, we introduce the Unstructured Recursive Network (URN) and demonstrate that it can exhibit similar flexibility during training via gradient descent. We show empirically that many of the different neural network structures commonly used in practice today (including fully connected, locally connected, and residual networks of different depths and widths) can emerge dynamically from the same URN. These different structures can be derived using gradient descent on a single general loss function, where the structure of the data and the relative strengths of various regulator terms determine the structure of the emergent network. We show that this loss function and the regulators arise naturally when considering the symmetries of the network as well as the geometric properties of the input data.

Comment: Proceedings of NeurIPS workshop on Real Neurons & Hidden Units. 5 pages, 6 figures
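The abstract describes structure emerging from gradient descent on a task loss plus regulator terms acting on an unstructured, recursively applied weight matrix. The sketch below is a minimal illustration of that general idea, not the paper's implementation: the class name UnstructuredRecursiveNet, the number of recursion steps, and the plain L1 penalty standing in for the paper's regulators are all assumptions made for illustration.

```python
# Hypothetical sketch: a single unstructured square weight matrix is
# applied recursively, and a regulator term (a plain L1 penalty as a
# stand-in) is added to the task loss so that connectivity structure
# can emerge during gradient descent. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class UnstructuredRecursiveNet(nn.Module):
    def __init__(self, dim_in, dim_hidden, dim_out, n_steps=4):
        super().__init__()
        self.encode = nn.Linear(dim_in, dim_hidden)
        # Unstructured recurrent weights: no layers, locality, or skip
        # connections are imposed a priori.
        self.W = nn.Parameter(0.1 * torch.randn(dim_hidden, dim_hidden))
        self.readout = nn.Linear(dim_hidden, dim_out)
        self.n_steps = n_steps

    def forward(self, x):
        h = torch.relu(self.encode(x))
        for _ in range(self.n_steps):
            h = torch.relu(h @ self.W)  # same weights reused each step
        return self.readout(h)


def training_step(model, x, y, optimizer, reg_strength=1e-3):
    """One gradient step on task loss + regulator term.

    The relative strength of the regulator (reg_strength) is the knob
    that, in the spirit of the abstract, shapes which connectivity
    pattern survives training.
    """
    optimizer.zero_grad()
    logits = model(x)
    task_loss = F.cross_entropy(logits, y)
    regulator = model.W.abs().sum()  # simple L1 stand-in for the paper's regulators
    loss = task_loss + reg_strength * regulator
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = UnstructuredRecursiveNet(dim_in=20, dim_hidden=64, dim_out=5)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(32, 20)              # dummy batch
    y = torch.randint(0, 5, (32,))
    for step in range(10):
        loss = training_step(model, x, y, opt)
    print(f"final loss: {loss:.3f}")
```

In this toy setup, inspecting the sparsity pattern of the learned W after training (e.g., block, banded, or skip-like structure) is how one would check which connectivity has emerged for a given regulator strength.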

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1911.11691
Document Type:
Working Paper