Deep ReLU Programming
- Publication Year :
- 2020
Abstract
- Feed-forward ReLU neural networks partition their input domain into finitely many "affine regions", on each of which the neuron activation pattern is constant and the network acts as an affine map. We analyze this mathematical structure and provide algorithmic primitives for efficiently applying linear-programming techniques to the iterative minimization of such non-convex functions. In particular, we propose an extension of the Simplex algorithm that iterates on induced vertices but, in addition, can move its feasible region to adjacent "affine regions" in a computationally efficient way. As a special case we recover the Barrodale-Roberts algorithm for LAD regression, and we are also able to train the first layer of a neural network with L1 training loss decreasing in every step.
- Comment: 54 pages
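To make the abstract's first claim concrete, here is a minimal sketch, not taken from the paper: for a feed-forward ReLU network, the sign pattern of the pre-activations at a point x identifies the "affine region" containing x, on which the network coincides with a single affine map A x + b. The two-layer architecture, layer sizes, and random weights below are assumptions made purely for illustration.

```python
import numpy as np

# Illustrative sketch (not from the paper): within one affine region,
# each ReLU is either identically 0 or the identity, so the whole
# network collapses to an affine map A x + b on that region.

rng = np.random.default_rng(0)
Ws = [rng.standard_normal((5, 3)), rng.standard_normal((1, 5))]  # layer weights (assumed sizes)
bs = [rng.standard_normal(5), rng.standard_normal(1)]            # layer biases

def forward(x):
    """Plain feed-forward pass with ReLU on the hidden layer(s)."""
    h = x
    for W, b in zip(Ws[:-1], bs[:-1]):
        h = np.maximum(W @ h + b, 0.0)
    return Ws[-1] @ h + bs[-1]

def activation_pattern(x):
    """0/1 indicators of which hidden neurons are active at x."""
    pattern, h = [], x
    for W, b in zip(Ws[:-1], bs[:-1]):
        z = W @ h + b
        pattern.append((z > 0).astype(float))
        h = np.maximum(z, 0.0)
    return pattern

def local_affine_map(x):
    """Compose (A, b) of the affine map equal to the network on x's region."""
    A, c = np.eye(len(x)), np.zeros(len(x))
    for (W, b), d in zip(zip(Ws[:-1], bs[:-1]), activation_pattern(x)):
        D = np.diag(d)                     # freezes each ReLU on this region
        A, c = D @ W @ A, D @ (W @ c + b)
    return Ws[-1] @ A, Ws[-1] @ c + bs[-1]  # output layer is affine

x = rng.standard_normal(3)
A, b = local_affine_map(x)
assert np.allclose(forward(x), A @ x + b)  # exact equality on x's region
```

The sketch only illustrates the piecewise-affine structure; the algorithmic contribution the abstract describes, a Simplex-like iteration that can hop between adjacent regions, builds on this structure but is not reproduced here.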
- Subjects :
- Mathematics - Optimization and Control
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2011.14895
- Document Type :
- Working Paper