
On the Activation Space of ReLU equipped Deep Neural Networks.

Authors :
Chaukair, Mustafa
Schütte, Christof
Sunkara, Vikram
Source :
Procedia Computer Science; 2023, Vol. 222, p624-635, 12p
Publication Year :
2023

Abstract

Modern Deep Neural Networks are getting wider and deeper in their architecture design. However, with an increasing number of parameters, the decision mechanism becomes more opaque. Therefore, there is a need for understanding the structures arising in the hidden layers of deep neural networks. In this work, we present a new mathematical framework for describing the canonical polyhedral decomposition in the input space, and in addition, we introduce the notions of collapsing and preserving patches, pertinent to understanding the forward map and the activation space they induce. The activation space can be seen as the output of a layer and, in the particular case of ReLU activations, we prove that this output has the structure of a polyhedral complex. [ABSTRACT FROM AUTHOR]
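The sketch below is not taken from the paper; it is a minimal numpy illustration of the underlying idea the abstract refers to: a single ReLU layer partitions its input space into convex polyhedral regions, since every input sharing the same on/off activation pattern satisfies the same set of half-space constraints W x + b >= 0 or < 0, and the forward map is affine on each such region. The layer sizes, grid, and helper names are illustrative assumptions, not the authors' construction.

import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((5, 2))   # 5 ReLU units acting on a 2-D input (illustrative sizes)
b = rng.standard_normal(5)

def activation_pattern(x):
    """Binary on/off pattern of the ReLU layer at input x."""
    return tuple((W @ x + b > 0).astype(int))

def layer_output(x):
    """Forward map of the ReLU layer; affine within each activation region."""
    return np.maximum(W @ x + b, 0.0)

# Sample the input plane and count distinct activation patterns, i.e. the
# polyhedral cells of the induced decomposition that the sample grid hits.
grid = [np.array([u, v])
        for u in np.linspace(-3, 3, 60)
        for v in np.linspace(-3, 3, 60)]
patterns = {activation_pattern(x) for x in grid}
print(f"distinct activation regions found on the grid: {len(patterns)}")

Running this prints the number of activation patterns encountered on the grid, giving a rough picture of how the hyperplanes W_i x + b_i = 0 cut the plane into cells; the paper's contribution concerns the structure of the images of such cells (the activation space), which it shows forms a polyhedral complex.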

Details

Language :
English
ISSN :
1877-0509
Volume :
222
Database :
Supplemental Index
Journal :
Procedia Computer Science
Publication Type :
Academic Journal
Accession number :
171311560
Full Text :
https://doi.org/10.1016/j.procs.2023.08.200