Towards Lower Bounds on the Depth of ReLU Neural Networks
Source: SIAM Journal on Discrete Mathematics. 2023, Vol. 37, Issue 2, p997-1029. 33p.
Publication Year: 2023
Abstract:
We contribute to a better understanding of the class of functions that can be represented by a neural network with ReLU activations and a given architecture. Using techniques from mixed-integer optimization, polyhedral theory, and tropical geometry, we provide a mathematical counterbalance to the universal approximation theorems which suggest that a single hidden layer is sufficient for learning any function. In particular, we investigate whether the class of exactly representable functions strictly increases by adding more layers (with no restrictions on size). As a by-product of our investigations, we settle an old conjecture about piecewise linear functions by Wang and Sun [IEEE Trans. Inform. Theory, 51 (2005), pp. 4425-4431] in the affirmative. We also present upper bounds on the sizes of neural networks required to represent functions with logarithmic depth. [ABSTRACT FROM AUTHOR]
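For readers unfamiliar with exact ReLU representation, the sketch below is illustrative background, not the paper's construction: it uses the standard identity max(a, b) = a + max(0, b - a), which one hidden ReLU layer computes exactly, and composes it in a balanced tree so that the maximum of 2^k inputs needs k such layers. Whether this logarithmic depth is actually necessary for exact representation is the kind of lower-bound question the article studies. All function names here are our own assumptions for the sake of the example.

    # Illustrative sketch (assumption: this standard identity, not the
    # paper's method). One ReLU computes an exact two-argument max;
    # composing it gives max of 2^k inputs with k "hidden layers".

    def relu(x: float) -> float:
        # Rectified linear unit: the network's activation function.
        return x if x > 0.0 else 0.0

    def max2(a: float, b: float) -> float:
        # Exact two-argument maximum via a single ReLU:
        # max(a, b) = a + max(0, b - a).
        return a + relu(b - a)

    def max4(x1: float, x2: float, x3: float, x4: float) -> float:
        # Two levels of the identity: depth grows logarithmically in
        # the number of arguments under this tree construction.
        return max2(max2(x1, x2), max2(x3, x4))

    if __name__ == "__main__":
        assert max4(1.0, -2.0, 3.5, 0.0) == 3.5
        assert max2(-1.0, -4.0) == -1.0
        print("exact ReLU max identities verified")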
Subjects:
*LOGARITHMIC functions
*LOGICAL prediction
*GEOMETRY
*SELF-expression
Details
Language: English
ISSN: 0895-4801
Volume: 37
Issue: 2
Database: Academic Search Index
Journal: SIAM Journal on Discrete Mathematics
Publication Type: Academic Journal
Accession Number: 169719888
Full Text: https://doi.org/10.1137/22M1489332