
A Precise Characterization of SGD Stability Using Loss Surface Geometry

Authors:
Dexter, Gregory
Ocejo, Borja
Keerthi, Sathiya
Gupta, Aman
Acharya, Ayan
Khanna, Rajiv

Publication Year: 2024

Abstract

Stochastic Gradient Descent (SGD) stands as a cornerstone optimization algorithm with proven real-world empirical success but relatively limited theoretical understanding. Recent research has illuminated a key factor behind its practical efficacy: the implicit regularization it induces. Several studies have investigated the linear stability property of SGD in the vicinity of a stationary point as a predictive proxy for sharpness and generalization error in overparameterized neural networks (Wu et al., 2022; Jastrzebski et al., 2019; Cohen et al., 2021). In this paper, we delve deeper into the relationship between linear stability and sharpness. More specifically, we delineate the necessary and sufficient conditions for linear stability, contingent on the hyperparameters of SGD and the sharpness at the optimum. Toward this end, we introduce a novel coherence measure of the loss Hessian that encapsulates the geometric properties of the loss function relevant to the linear stability of SGD. It enables us to provide a simplified sufficient condition for identifying linear instability at an optimum. Notably, compared to previous works, our analysis relies on significantly milder assumptions and applies to a broader class of loss functions than previously known, encompassing not only mean-squared error but also cross-entropy loss.

Comment: To appear at ICLR 2024
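To make the notion of linear stability concrete, the sketch below is a minimal, hypothetical illustration (not the paper's actual criterion or coherence measure): for full-batch gradient descent on a quadratic loss, linear stability at a minimum requires eta * lambda_max(H) <= 2, where lambda_max(H) is the sharpness, and mini-batch noise can destabilize SGD at step sizes where the full-batch condition still holds. The toy interpolating least-squares setup and all names below are illustrative assumptions.

```python
# Minimal sketch, assuming a toy interpolating least-squares problem:
# loss_i(w) = 0.5 * (x_i . w)^2, so w* = 0 is a minimum shared by every
# sample. This illustrates the classical full-batch stability threshold
# eta * lambda_max(H) <= 2; it is NOT the paper's exact SGD criterion.
import numpy as np

rng = np.random.default_rng(0)

n, d = 32, 8
X = rng.normal(size=(n, d))
H = X.T @ X / n                        # full-batch Hessian at the optimum
sharpness = np.linalg.eigvalsh(H)[-1]  # lambda_max(H)

def sgd_diverges(eta, batch_size, steps=2000):
    """Run SGD from a small perturbation of w* and report divergence."""
    w = 1e-3 * rng.normal(size=d)
    for _ in range(steps):
        idx = rng.choice(n, size=batch_size, replace=False)
        Xb = X[idx]
        # Mini-batch gradient of 0.5 * ||Xb w||^2 / batch_size
        w = w - eta * (Xb.T @ (Xb @ w)) / batch_size
        if not np.isfinite(w).all() or np.linalg.norm(w) > 1e6:
            return True
    return False

# Probe step sizes below, near, and above the full-batch threshold 2/sharpness.
for eta in [0.5 / sharpness, 1.9 / sharpness, 2.5 / sharpness]:
    print(f"eta*sharpness = {eta * sharpness:.2f}: "
          f"full batch diverges={sgd_diverges(eta, n)}, "
          f"batch of 2 diverges={sgd_diverges(eta, 2)}")
```

Because w* = 0 interpolates every sample, the gradient noise is multiplicative, so small batches can amplify perturbations and diverge even when full-batch descent contracts; characterizing exactly when this happens, in terms of the SGD hyperparameters and the Hessian geometry, is the regime the paper studies.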

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2401.12332
Document Type: Working Paper