
Statistical mechanics of deep learning beyond the infinite-width limit

Authors:
Ariosto, S.
Pacelli, R.
Pastore, M.
Ginelli, F.
Gherardi, M.
Rotondo, P.
Publication Year:
2022

Abstract

Decades-long literature testifies to the success of statistical mechanics at clarifying fundamental aspects of deep learning. Yet the ultimate goal remains elusive: we lack a complete theoretical framework to predict practically relevant scores, such as the train and test accuracy, from knowledge of the training data. Huge simplifications arise in the infinite-width limit, where the number of units $N_\ell$ in each hidden layer far exceeds the number $P$ of training examples. This idealisation, however, blatantly departs from the reality of deep learning practice, where training sets are larger than the widths of the networks. Here, we show one way to overcome these limitations. The partition function for fully-connected architectures, which encodes information about the trained models, can be computed analytically with the toolset of statistical mechanics. The computation holds in the "thermodynamic limit" where both $N_\ell$ and $P$ are large and their ratio $\alpha_\ell = P/N_\ell$, which vanishes in the infinite-width limit, is now finite and generic. This advance allows us to obtain (i) a closed formula for the generalisation error associated with a regression task in a one-hidden-layer network at finite $\alpha_\ell$; (ii) an expression for the partition function (technically, via an "effective action") of fully-connected architectures with an arbitrary number of hidden layers, in terms of a finite number of degrees of freedom (technically, "order parameters"); (iii) a demonstration that the Gaussian processes arising in the infinite-width limit should be replaced by Student-t processes; (iv) a simple analytical criterion to predict, for a given training set, whether finite-width networks (with ReLU activations) achieve better test accuracy than infinite-width ones.

Comment: 15 pages, 4 figures. Comments are welcome
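As a rough numerical illustration of point (iii), and not the paper's calculation: the sketch below samples the outputs of randomly initialised one-hidden-layer ReLU networks at a fixed input and estimates the excess kurtosis of the resulting distribution. At small width the tails are visibly heavier than Gaussian, and the excess kurtosis shrinks towards zero as the width grows, recovering the Gaussian-process limit; the paper's precise statement concerns Student-t processes in the regime where $\alpha_\ell = P/N_\ell$ stays finite. All sizes, the seed, and the shortcut of sampling pre-activations directly are arbitrary choices made for this sketch.

```python
# Minimal numerical sketch (not the paper's derivation): outputs of a random
# one-hidden-layer ReLU network have heavier-than-Gaussian tails at finite width N
# and approach the Gaussian-process limit as N grows. All sizes/seeds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
D = 20  # input dimension (arbitrary)

def sample_outputs(N, n_samples=50_000):
    """Sample f(x) = (1/sqrt(N)) * v . relu(W x / sqrt(D)) over random networks.

    For a fixed unit-norm input x and i.i.d. standard-Gaussian first-layer weights,
    the pre-activations h = W x / sqrt(D) are i.i.d. N(0, 1/D), so we draw them
    directly instead of materialising W.
    """
    h = rng.standard_normal((n_samples, N)) / np.sqrt(D)   # pre-activations
    v = rng.standard_normal((n_samples, N))                # readout weights
    return (v * np.maximum(h, 0.0)).sum(axis=1) / np.sqrt(N)

def excess_kurtosis(f):
    """Fourth standardised moment minus 3; zero for a Gaussian."""
    z = (f - f.mean()) / f.std()
    return (z**4).mean() - 3.0

for N in (5, 50, 500):
    print(f"N = {N:4d}   excess kurtosis ~ {excess_kurtosis(sample_outputs(N)):+.3f}")
# Expected trend: clearly positive at small N (heavy tails), shrinking towards 0
# as N grows, consistent with a Gaussian process arising only at infinite width.
```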

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....9c6bf7055a365ec525cadaaa5ece75d8