1. The loss surfaces of neural networks with general activation functions
- Author
- Jon P. Keating, Francesco Mezzadri, Nicholas P. Baskerville, and Joseph Najnudel
- Subjects
- FOS: Computer and information sciences; Statistics and Probability; Computer Science - Machine Learning (cs.LG); FOS: Physical sciences; FOS: Mathematics; Mathematical Physics (math-ph); Statistical Mechanics (cond-mat.stat-mech); Probability (math.PR); Statistical and Nonlinear Physics; Statistics, Probability and Uncertainty; spin glass; asymptotic analysis; artificial neural network; deep learning; artificial intelligence; statistical physics
- Abstract
The loss surfaces of deep neural networks have been the subject of several studies, theoretical and experimental, over the last few years. One strand of work considers the complexity, in the sense of local optima, of high dimensional random functions with the aim of informing how local optimisation methods may perform in such complicated settings. Prior work of Choromanska et al. (2015) established a direct link between the training loss surfaces of deep multi-layer perceptron networks and spherical multi-spin glass models under some very strong assumptions on the network and its data. In this work, we test the validity of this approach by removing the undesirable restriction to ReLU activation functions. In doing so, we chart a new path through the spin glass complexity calculations using supersymmetric methods in Random Matrix Theory which may prove useful in other contexts. Our results shed new light on both the strengths and the weaknesses of spin glass models in this context.
- Comment
- 50 pages, 11 figures; references added for Kac-Rice reduction to RMT method; updates following JSTAT review and publication
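For context, the spherical multi-spin (p-spin) glass referenced in the abstract is conventionally defined by the Hamiltonian below; this is the standard textbook form of the model, not notation drawn from the paper itself, and the identification of the interaction order p with the network depth is only the rough correspondence of Choromanska et al. (2015) under their independence assumptions.

\[
H_{N,p}(\sigma) \;=\; \frac{1}{N^{(p-1)/2}} \sum_{i_1,\dots,i_p=1}^{N} J_{i_1,\dots,i_p}\,\sigma_{i_1}\cdots\sigma_{i_p},
\qquad \sigma \in S^{N-1}\!\bigl(\sqrt{N}\bigr),
\]

where the couplings J_{i_1,...,i_p} are i.i.d. standard Gaussian random variables and the configuration \sigma is constrained to the sphere of radius \sqrt{N}. The complexity studied in this line of work counts the expected number of critical points (or local minima) of H_{N,p} below a given energy level, typically via a Kac-Rice formula.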
- Published
- 2021