
Linear convergence of first order methods for non-strongly convex optimization

Authors :
Ion Necoara
Yurii Nesterov
François Glineur
UCL - SST/ICTM/INMA - Pôle en ingénierie mathématique
UCL - SSH/LIDAM/CORE - Center for operations research and econometrics
Source :
Mathematical Programming, Vol. 175, p. 69-107 (2019)
Publication Year :
2018
Publisher :
Springer Science and Business Media LLC, 2018.

Abstract

The standard assumption for proving linear convergence of first order methods for smooth convex optimization is the strong convexity of the objective function, an assumption which does not hold for many practical applications. In this paper, we derive linear convergence rates of several first order methods for solving smooth non-strongly convex constrained optimization problems, i.e., problems whose objective function has a Lipschitz continuous gradient and satisfies some relaxed strong convexity condition. In particular, in the case of smooth constrained convex optimization, we provide several relaxations of the strong convexity condition and prove that they are sufficient for obtaining linear convergence for several first order methods, such as projected gradient, fast gradient and feasible descent methods. We also provide examples of functional classes that satisfy our proposed relaxations of strong convexity conditions. Finally, we show that the proposed relaxed strong convexity conditions cover important applications, ranging from solving linear systems and linear programs to dual formulations of linearly constrained convex problems.

Comment: 36 pages, 4 figures
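As a minimal illustrative sketch (not taken from the paper), the snippet below runs projected gradient descent on a constrained least-squares problem with a rank-deficient matrix: the objective is smooth with a Lipschitz continuous gradient but not strongly convex, yet the method still exhibits a linear (geometric) rate, in line with the kind of relaxed strong convexity (quadratic growth / error bound) conditions discussed in the abstract. All problem data and parameter choices here are hypothetical.

```python
# Hypothetical illustration: projected gradient on f(x) = 0.5*||A x - b||^2
# with rank(A) < n (so f is NOT strongly convex), constrained to x >= 0.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 30, 50, 10                      # rank-deficient: rank(A) = r < n
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = A @ np.abs(rng.standard_normal(n))    # a feasible minimizer exists, f* = 0

L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient

def f(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

x = np.zeros(n)
history = []
for k in range(500):
    grad = A.T @ (A @ x - b)
    x = np.maximum(x - grad / L, 0.0)     # gradient step + projection onto x >= 0
    history.append(f(x))

# Since f* = 0 by construction, the per-step ratio below settling strictly
# below 1 indicates a linear rate despite the lack of strong convexity.
ratios = [history[k + 1] / history[k]
          for k in range(len(history) - 1) if history[k] > 1e-14]
print("final objective:", history[-1])
print("typical contraction factor:", np.median(ratios))
```

This only sketches the phenomenon; the paper's contribution is to identify general relaxed strong convexity conditions under which such linear rates are provable for projected gradient, fast gradient and feasible descent methods.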

Details

ISSN :
1436-4646 and 0025-5610
Volume :
175
Database :
OpenAIRE
Journal :
Mathematical Programming
Accession number :
edsair.doi.dedup.....6854f1f53d908543352743a4572c8d0c
Full Text :
https://doi.org/10.1007/s10107-018-1232-1