
Batch Layer Normalization, A new normalization layer for CNNs and RNN

Authors:
Ziaee, Amir
Çano, Erion
Publication Year:
2022

Abstract

This study introduces a new normalization layer termed Batch Layer Normalization (BLN) to reduce the problem of internal covariate shift in deep neural network layers. As a combined version of batch and layer normalization, BLN adaptively weights mini-batch and feature normalization based on the inverse size of the mini-batch to normalize the input to a layer during learning. At inference time it performs the same computation with a minor change, using either mini-batch statistics or population statistics. This choice between mini-batch and population statistics lets BLN play a comprehensive role in the hyper-parameter optimization of models. A key advantage of BLN is that it is supported by a theoretical analysis showing it is independent of the input data, while its statistical configuration depends heavily on the task performed, the amount of training data, and the batch size. Test results indicate the application potential of BLN and its faster convergence compared to batch normalization and layer normalization in both Convolutional and Recurrent Neural Networks. The code of the experiments is publicly available online (https://github.com/A2Amir/Batch-Layer-Normalization).

Comment: Published in the proceedings of the 6th International Conference on Advances in Artificial Intelligence, ICAAI 2022, Birmingham, UK
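The adaptive combination the abstract describes lends itself to a short illustration. The following NumPy sketch is a hypothetical rendering of the idea, assuming a simple inverse-batch-size weighting between batch-normalized and layer-normalized activations; the function name, the weighting rule, and the omission of learnable scale/shift parameters and running population statistics are all assumptions for illustration, not the paper's exact formulation (see the linked repository for the authors' implementation).

```python
import numpy as np

def batch_layer_norm(x, eps=1e-5):
    """Hypothetical sketch of Batch Layer Normalization (BLN).

    Combines batch statistics (computed across the mini-batch axis)
    and layer statistics (computed across the feature axis), with the
    weight on the batch term derived from the inverse mini-batch size,
    as the abstract describes. Assumption: the actual weighting rule
    and inference-time handling in the paper may differ.

    x: array of shape (batch_size, num_features)
    """
    batch_size = x.shape[0]
    # Weight on mini-batch normalization shrinks as batches get smaller
    # (assumption: a simple inverse-size weighting for illustration).
    w = 1.0 - 1.0 / batch_size

    # Batch-normalization statistics: per feature, across the mini-batch.
    bn_mean = x.mean(axis=0, keepdims=True)
    bn_var = x.var(axis=0, keepdims=True)
    x_bn = (x - bn_mean) / np.sqrt(bn_var + eps)

    # Layer-normalization statistics: per example, across features.
    ln_mean = x.mean(axis=1, keepdims=True)
    ln_var = x.var(axis=1, keepdims=True)
    x_ln = (x - ln_mean) / np.sqrt(ln_var + eps)

    # Adaptive combination of the two normalized signals.
    return w * x_bn + (1.0 - w) * x_ln

# Usage: a mini-batch of 8 examples with 16 features each.
x = np.random.randn(8, 16)
y = batch_layer_norm(x)
```

With a large batch, the weight w approaches 1 and the output is dominated by batch normalization; with very small batches, where batch statistics are unreliable, the layer-normalization term takes over.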

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2209.08898
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3571560.3571566