
Promoting Exploration in Memory-Augmented Adam using Critical Momenta

Authors :
Malviya, Pranshu
Mordido, Gonçalo
Baratin, Aristide
Harikandeh, Reza Babanezhad
Huang, Jerry
Lacoste-Julien, Simon
Pascanu, Razvan
Chandar, Sarath
Publication Year :
2023

Abstract

Adaptive gradient-based optimizers, particularly Adam, have left their mark on the training of large-scale deep learning models. The strength of such optimizers is that they exhibit fast convergence while being more robust to hyperparameter choices. However, they often generalize worse than non-adaptive methods. Recent studies have tied this performance gap to the flatness of the selected minima: adaptive methods tend to find solutions in sharper basins of the loss landscape, which in turn hurts generalization. To overcome this issue, we propose a new memory-augmented version of Adam that promotes exploration towards flatter minima by maintaining a buffer of critical momentum terms during training. Intuitively, the buffer makes the optimizer overshoot past a basin of attraction if it is not wide enough. We empirically show that our method improves the performance of several variants of Adam on standard supervised language modelling and image classification tasks.
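
The abstract only sketches the mechanism, so the following is a minimal, hypothetical PyTorch sketch of the buffer idea: after each Adam step, a small FIFO buffer of recent first-moment (momentum) estimates is aggregated and applied as an extra displacement, which can push the iterate out of a narrow basin. The class name MemoryAugmentedAdam, the buffer_size parameter, and the plain averaging of buffered momenta are illustrative assumptions, not the authors' algorithm; in particular, the paper's criterion for selecting "critical" momenta is not specified in this record.

import torch

class MemoryAugmentedAdam(torch.optim.Adam):
    # Hypothetical sketch: Adam augmented with a per-parameter FIFO buffer
    # of past first-moment (momentum) estimates. Not the authors' exact
    # algorithm; it only illustrates the "buffer of momenta" idea above.
    def __init__(self, params, lr=1e-3, buffer_size=5, **kwargs):
        super().__init__(params, lr=lr, **kwargs)
        self.buffer_size = buffer_size
        self.momentum_buffers = {}  # parameter -> list of past exp_avg tensors

    def step(self, closure=None):
        loss = super().step(closure)  # standard Adam update first
        with torch.no_grad():
            for group in self.param_groups:
                for p in group['params']:
                    state = self.state.get(p)
                    if not state or 'exp_avg' not in state:
                        continue  # parameter has no Adam state yet
                    buf = self.momentum_buffers.setdefault(p, [])
                    buf.append(state['exp_avg'].clone())
                    if len(buf) > self.buffer_size:
                        buf.pop(0)  # FIFO: drop the oldest momentum
                    # Aggregate the buffered momenta (a plain mean here) and
                    # take an extra step along them; this is what can make
                    # the optimizer overshoot out of a narrow basin.
                    extra = torch.stack(buf).mean(dim=0)
                    p.add_(extra, alpha=-group['lr'])
        return loss

Under these assumptions the class is a drop-in replacement for torch.optim.Adam, e.g. opt = MemoryAugmentedAdam(model.parameters(), lr=1e-3, buffer_size=5). Bounding the buffer to a fixed size keeps the memory overhead at buffer_size extra copies of each parameter's momentum.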

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438464087
Document Type :
Electronic Resource