Robust Accelerated Primal-Dual Methods for Computing Saddle Points

Authors :
Zhang, Xuan
Aybat, Necdet Serhat
Gurbuzbalaban, Mert
Source :
SIAM Journal on Optimization; 2024, Vol. 34 Issue 1, p1097-1130, 34p
Publication Year :
2024

Abstract

We consider strongly-convex-strongly-concave saddle point problems, assuming access to unbiased stochastic estimates of the gradients. We propose a stochastic accelerated primal-dual (SAPD) algorithm and show that the SAPD sequence, generated using constant primal-dual step sizes, converges linearly to a neighborhood of the unique saddle point. Interpreting the size of this neighborhood as a measure of robustness to gradient noise, we obtain explicit characterizations of robustness in terms of the SAPD parameters and problem constants. Based on these characterizations, we develop computationally tractable techniques for optimizing the SAPD parameters, i.e., the primal and dual step sizes and the momentum parameter, to achieve a desired trade-off between the convergence rate and robustness on the Pareto curve. This allows SAPD, as an accelerated method, to enjoy fast convergence while remaining robust to noise. SAPD admits convergence guarantees for the distance metric with a variance term that is optimal up to a logarithmic factor, which can be removed by employing a restarting strategy. We also discuss how the convergence and robustness results extend to the merely-convex-merely-concave setting. Finally, we illustrate our framework on a distributionally robust logistic regression problem.
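The abstract's iteration can be illustrated with a minimal sketch. The code below assumes the standard primal-dual template with momentum on the dual gradient (a momentum-extrapolated dual gradient, a dual ascent step, then a primal descent step using the fresh dual iterate); the paper's exact update order, step-size conditions, and parameter choices may differ. The toy problem, the step sizes `tau`, `sigma`, the momentum `theta`, and the Gaussian noise model are all illustrative assumptions, with the noise standing in for the unbiased stochastic gradient estimates.

```python
# Hedged sketch of a SAPD-style iteration on the toy strongly-convex-
# strongly-concave problem
#     L(x, y) = 0.5*x**2 + x*y - 0.5*y**2,   saddle point at (0, 0),
# with additive Gaussian noise standing in for stochastic gradient estimates.
# The update template, step sizes, and momentum value are assumptions for
# illustration, not the paper's tuned parameters.
import random

random.seed(0)

NOISE_STD = 0.01  # assumed gradient-noise level

def grad_x(x, y):
    """Noisy partial gradient of L with respect to x."""
    return x + y + random.gauss(0.0, NOISE_STD)

def grad_y(x, y):
    """Noisy partial gradient of L with respect to y."""
    return x - y + random.gauss(0.0, NOISE_STD)

def sapd_sketch(tau=0.05, sigma=0.05, theta=0.5, iters=2000):
    """Run the assumed SAPD-style loop and return the final iterate."""
    x, y = 1.0, 1.0            # initial point
    gy_prev = grad_y(x, y)     # previous dual gradient, for momentum
    for _ in range(iters):
        gy = grad_y(x, y)
        s = gy + theta * (gy - gy_prev)  # momentum-extrapolated dual gradient
        y = y + sigma * s                # dual ascent step
        x = x - tau * grad_x(x, y)       # primal descent with updated y
        gy_prev = gy
    return x, y

x, y = sapd_sketch()
print(abs(x), abs(y))  # the iterate settles in a noise-sized neighborhood of (0, 0)
```

With constant step sizes the iterate does not converge exactly but linearly contracts into a neighborhood of the saddle point whose radius scales with the noise level, which is the convergence-rate versus robustness trade-off the paper quantifies.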

Details

Language :
English
ISSN :
1052-6234
Volume :
34
Issue :
1
Database :
Complementary Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
176824633
Full Text :
https://doi.org/10.1137/21M1462775