Alternating Proximal-Gradient Steps for (Stochastic) Nonconvex-Concave Minimax Problems

Authors :
BOȚ, RADU IOAN
BÖHM, AXEL
Source :
SIAM Journal on Optimization; 2023, Vol. 33 Issue 3, p1884-1913, 30p
Publication Year :
2023

Abstract

Minimax problems of the form min_x max_y Ψ(x, y) have attracted increased interest, largely due to advances in machine learning, in particular generative adversarial networks and adversarial learning. These are typically trained using variants of stochastic gradient descent for the two players. Although convex-concave problems are well understood, with many efficient solution methods to choose from, theoretical guarantees outside of this setting are sometimes lacking even for the simplest algorithms. In particular, this is the case for alternating gradient descent ascent, where the two agents take turns updating their strategies. To partially close this gap in the literature, we prove a novel global convergence rate for the stochastic version of this method for finding a critical point of the envelope function Ψ̄(·) := max_y Ψ(·, y) in a setting which is not convex-concave. [ABSTRACT FROM AUTHOR]
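The alternating scheme described in the abstract can be sketched as follows. This is a minimal illustration of alternating gradient descent ascent, not the authors' exact proximal-gradient method; the toy objective, step sizes, and iteration count are all assumptions chosen for demonstration.

```python
import numpy as np


def alternating_gda(grad_x, grad_y, x0, y0, tau=0.05, sigma=0.05, iters=2000):
    """Alternating gradient descent ascent: the min player (x) takes a
    descent step, then the max player (y) takes an ascent step using the
    freshly updated x (the 'alternating' order, as opposed to simultaneous
    updates)."""
    x, y = x0, y0
    for _ in range(iters):
        x = x - tau * grad_x(x, y)      # descent step for the min player
        y = y + sigma * grad_y(x, y)    # ascent step uses the updated x
    return x, y


# Toy nonconvex-concave example (an assumption, not from the paper):
#   Psi(x, y) = x**4 / 4 - x**2 / 2 + x * y - y**2 / 2
# which is nonconvex in x but concave in y.
gx = lambda x, y: x**3 - x + y   # partial derivative in x
gy = lambda x, y: x - y          # partial derivative in y

x_star, y_star = alternating_gda(gx, gy, x0=1.5, y0=0.0)
```

For this toy problem the envelope function is Ψ̄(x) = max_y Ψ(x, y) = x⁴/4, whose only critical point is x = 0, so the iterates drift toward the origin; the point of the paper is a global convergence *rate* for the stochastic version of exactly this kind of alternating scheme.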

Details

Language :
English
ISSN :
1052-6234
Volume :
33
Issue :
3
Database :
Complementary Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
173676829
Full Text :
https://doi.org/10.1137/21M1465470