
Better Schedules for Low Precision Training of Deep Neural Networks

Authors :
Wolfe, Cameron R.
Kyrillidis, Anastasios
Source :
Machine Learning (2024): 1-19
Publication Year :
2024

Abstract

Low precision training can significantly reduce the computational overhead of training deep neural networks (DNNs). Though many such techniques exist, cyclic precision training (CPT), which dynamically adjusts precision throughout training according to a cyclic schedule, achieves particularly impressive improvements in training efficiency, while actually improving DNN performance. Existing CPT implementations take common learning rate schedules (e.g., cyclical cosine schedules) and use them for low precision training without adequate comparisons to alternative scheduling options. We define a diverse suite of CPT schedules and analyze their performance across a variety of DNN training regimes, some of which are unexplored in the low precision training literature (e.g., node classification with graph neural networks). From these experiments, we discover alternative CPT schedules that offer further improvements in training efficiency and model performance, as well as derive a set of best practices for choosing CPT schedules. Going further, we find that a correlation exists between model performance and training cost, and that changing the underlying CPT schedule can control the tradeoff between these two variables. To explain the direct correlation between model performance and training cost, we draw a connection between quantized training and critical learning periods, suggesting that aggressive quantization is a form of learning impairment that can permanently damage model performance.

Comment: 20 pages, 8 figures, 1 table, ACML 2023
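As a rough illustration of the cyclic schedules discussed in the abstract, the sketch below shows one way a cosine-based cyclic precision schedule might map a training step to an integer bit-width. The function name, the 3- and 8-bit bounds, and the cycle length are illustrative assumptions for this sketch, not values taken from the paper.

    import math

    def cyclic_cosine_precision(step, cycle_length, min_bits=3, max_bits=8):
        # Position within the current cycle, normalized to [0, 1).
        t = (step % cycle_length) / cycle_length
        # Cosine ramp from min_bits up to max_bits over one cycle.
        bits = min_bits + 0.5 * (max_bits - min_bits) * (1 - math.cos(math.pi * t))
        # Quantized training requires an integer bit-width.
        return int(round(bits))

    # Example (with the assumed defaults): early in a cycle the schedule
    # returns a low bit-width, e.g. cyclic_cosine_precision(120, 100) -> 3.

A schedule like this would be queried each training step to set the quantization precision for that step; the paper's contribution is to compare many such schedule shapes, not only the cosine form sketched here.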

Details

Database :
arXiv
Journal :
Machine Learning (2024): 1-19
Publication Type :
Report
Accession number :
edsarx.2403.02243
Document Type :
Working Paper
Full Text :
https://doi.org/10.1007/s10994-023-06480-0