
LR-CNN: Lightweight Row-centric Convolutional Neural Network Training for Memory Reduction

Authors:
Wang, Zhigang
Yang, Hangyu
Wang, Ning
Xu, Chuanfei
Nie, Jie
Wei, Zhiqiang
Gu, Yu
Yu, Ge
Publication Year:
2024

Abstract

Over the last decade, multi-layer Convolutional Neural Networks have advanced rapidly. However, training such complex networks is memory-intensive, since large volumes of intermediate data must be preserved across layers, especially when processing high-dimensional inputs with large batch sizes. This poses great challenges to the limited memory capacity of current accelerators (e.g., GPUs). Existing efforts mitigate this bottleneck either with external auxiliary solutions that incur additional hardware costs, or with internal modifications that risk an accuracy penalty. In contrast, our analysis reveals that intra- and inter-layer computations exhibit weak spatial-temporal dependency, and in some cases complete independence. This inspires us to break the traditional layer-by-layer (column) dataflow rule: operations are instead re-organized into rows that span all convolution layers. This lightweight design allows a majority of intermediate data to be discarded without any loss of accuracy. We study in particular the weak dependency between two consecutive rows. For the resulting skewed memory consumption, we provide two solutions suited to different scenarios. Evaluations on two representative networks confirm the effectiveness of our approach. We also validate that our intermediate dataflow optimization can be smoothly adopted by existing works for further memory reduction.
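To make the row-centric dataflow concrete, here is a minimal NumPy sketch of the general idea as we read it from the abstract: the final feature map is produced one row at a time, and each layer keeps only the sliding window of rows its successor still needs, instead of its full output map. The function names (conv_rows, row_centric_forward), the two-layer setup, and the valid-convolution/ReLU choices are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def conv_rows(x_rows, weight):
    # Valid 2D convolution over exactly k stacked input rows -> one output row.
    # x_rows: (k, W, C_in), weight: (k, k, C_in, C_out); ReLU applied at the end.
    k, W, _ = x_rows.shape
    out_w = W - k + 1
    out = np.empty((out_w, weight.shape[-1]))
    for j in range(out_w):
        # Contract the (k, k, C_in) patch against the kernel in one step.
        out[j] = np.tensordot(x_rows[:, j:j + k, :], weight, axes=3)
    return np.maximum(out, 0.0)

def row_centric_forward(x, w1, w2):
    # Two stacked conv layers evaluated row by row. Only the last k rows of
    # layer 1's output are retained in a sliding buffer, so the full
    # intermediate feature map is never materialized.
    k = w1.shape[0]
    H = x.shape[0]
    buf1, out_rows = [], []
    for r in range(H - k + 1):
        buf1.append(conv_rows(x[r:r + k], w1))  # one new layer-1 row
        if len(buf1) == k:
            out_rows.append(conv_rows(np.stack(buf1), w2))
            buf1.pop(0)  # drop the row no later computation depends on
    return np.stack(out_rows)

# Example: a 32x32 RGB input through two 3x3 layers; the layer-1 buffer
# holds 3 rows instead of a full 30-row intermediate feature map.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32, 3))
w1 = rng.standard_normal((3, 3, 3, 8))
w2 = rng.standard_normal((3, 3, 8, 16))
print(row_centric_forward(x, w1, w2).shape)  # (28, 28, 16)
```

Note that two adjacent output rows reuse k-1 of the k buffered intermediate rows, which is the weak inter-row dependency the abstract highlights; the buffer size grows only with network depth and kernel size, not with the image height or batch size.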

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2401.11471
Document Type:
Working Paper