
Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization

Authors:
Li, Chris Junchi
Yuan, Angela
Gidel, Gauthier
Gu, Quanquan
Jordan, Michael I.
Publication Year:
2022

Abstract

We propose a new first-order optimization algorithm -- Accelerated Gradient-Optimistic Gradient (AG-OG) Descent Ascent -- for separable convex-concave minimax optimization. The main idea of our algorithm is to carefully leverage the structure of the minimax problem, performing Nesterov acceleration on the individual component and optimistic gradient on the coupling component. Equipped with proper restarting, we show that AG-OG achieves the optimal convergence rate (up to a constant) for a variety of settings, including bilinearly coupled strongly convex-strongly concave minimax optimization (bi-SC-SC), bilinearly coupled convex-strongly concave minimax optimization (bi-C-SC), and bilinear games. We also extend our algorithm to the stochastic setting and achieve the optimal convergence rate in both bi-SC-SC and bi-C-SC settings. AG-OG is the first single-call algorithm with optimal convergence rates in both deterministic and stochastic settings for bilinearly coupled minimax optimization problems.

Comment: 44 pages. This version matches the camera-ready version that appeared at ICML 2023 under the same title.
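The full AG-OG updates (with Nesterov acceleration and restarting) are given in the paper itself. As a rough illustration of the single-call optimistic-gradient component that the abstract refers to, the sketch below runs past-gradient OGDA on a pure bilinear game min_x max_y x^T B y. Function name, step size, and iteration count are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def ogda_bilinear(B, x0, y0, eta=0.1, iters=4000):
    """Single-call optimistic gradient descent ascent on min_x max_y x^T B y.

    Illustrative sketch only (not the AG-OG method itself). Uses the
    past-gradient form, which needs one fresh gradient per iteration:
        x_{t+1} = x_t - eta * (2*g_t - g_{t-1}),
    and symmetrically (ascent) in y, where g_t is the current gradient.
    """
    x, y = x0.astype(float), y0.astype(float)
    gx_prev, gy_prev = B @ y, B.T @ x  # gradients at the starting point
    for _ in range(iters):
        gx, gy = B @ y, B.T @ x        # the single gradient call this step
        x = x - eta * (2 * gx - gx_prev)
        y = y + eta * (2 * gy - gy_prev)
        gx_prev, gy_prev = gx, gy
    return x, y
```

On a bilinear game the unique saddle point is the origin; plain simultaneous gradient descent ascent diverges here, while the optimistic correction term `2*g_t - g_{t-1}` yields linear convergence for a small enough step size.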

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2210.17550
Document Type:
Working Paper