
On the ideal number of groups for isometric gradient propagation.

Authors :
Kim, Bum Jun
Choi, Hyeyeon
Jang, Hyeonah
Kim, Sang Woo
Source :
Neurocomputing, March 2024, Vol. 573.
Publication Year :
2024

Abstract

Recently, various normalization layers have been proposed to stabilize the training of deep neural networks. Among them, group normalization generalizes layer normalization and instance normalization by allowing a degree of freedom in the number of groups it uses. However, determining the optimal number of groups requires trial-and-error hyperparameter tuning, and such experiments are time-consuming. In this study, we discuss a principled method for setting the number of groups. First, we find that the number of groups influences the gradient behavior of the group normalization layer. Based on this observation, we derive the ideal number of groups, which calibrates the gradient scale to facilitate gradient descent optimization. This paper is the first to propose an optimal number of groups that is theoretically grounded, architecture-aware, and specified layer-wise for every layer. The proposed method outperformed existing methods across numerous neural network architectures, tasks, and datasets.

Highlights:
• We propose a method to determine the number of groups in group normalization.
• A theoretical analysis of group normalization with an activation function is provided.
• The proposed method is validated on various practical problems.
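The role of the group count is easiest to see concretely: at one extreme, group normalization reduces to layer normalization, and at the other, to instance normalization. Below is a minimal sketch using PyTorch's nn.GroupNorm; the group counts shown are illustrative examples only, since the paper's derived layer-wise values cannot be reproduced from the abstract alone.

import torch
import torch.nn as nn

# Example feature map: (batch, channels, height, width)
x = torch.randn(8, 32, 16, 16)

# num_groups=1: all channels share one group, so statistics are computed
# over (C, H, W) -- the layer normalization extreme.
gn_as_ln = nn.GroupNorm(num_groups=1, num_channels=32)

# num_groups=num_channels: one channel per group, so statistics are
# per-channel per-sample -- the instance normalization extreme.
gn_as_in = nn.GroupNorm(num_groups=32, num_channels=32)

# An intermediate, hand-picked choice; the paper instead derives a
# layer-wise group count that calibrates the gradient scale.
gn = nn.GroupNorm(num_groups=8, num_channels=32)
y = gn(x)
print(y.shape)  # torch.Size([8, 32, 16, 16]); each group normalized to ~zero mean, unit variance

In practice, the group count G must divide the channel count, so the tuning space is discrete and architecture-dependent, which is why a per-layer rule is preferable to a single global hyperparameter.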

Subjects

Subjects :
Artificial neural networks

Details

Language :
English
ISSN :
0925-2312
Volume :
573
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
175164779
Full Text :
https://doi.org/10.1016/j.neucom.2023.127217