
New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure.

Authors :
Freund, Robert M.
Lu, Haihao
Source :
Mathematical Programming; Aug 2018, Vol. 170, Issue 2, p445-477, 33p
Publication Year :
2018

Abstract

Motivated by recent work of Renegar (A framework for applying subgradient methods to conic optimization problems, arXiv:1503.02611v2, 2015), we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem f* = min_{x∈Q} f(x), where we presume knowledge of a strict lower bound f_slb < f*. [Indeed, f_slb is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.), as well as in Renegar's transformed version of the standard conic optimization problem (arXiv:1503.02611v2, 2015); in all these cases one has f_slb = 0 < f*.] We introduce a new functional measure called the growth constant G for f(·), which measures how quickly the level sets of f(·) grow relative to the function value, and which plays a fundamental role in the complexity analysis. When f(·) is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods that can improve existing computational guarantees in several ways, most notably when the initial iterate x_0 is far from the optimal solution set.
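To illustrate how a known strict lower bound f_slb can drive a subgradient method, here is a minimal sketch of a Polyak-type step size that substitutes f_slb for the (unknown) optimal value f*. The function name `subgradient_descent_lb` and the unconstrained setting (Q = R^n, no projection) are illustrative choices; this is not the paper's exact scheme, whose step rules and guarantees involve the growth constant G.

```python
import numpy as np

def subgradient_descent_lb(f, subgrad, x0, f_slb, iters=500):
    """Subgradient method with a Polyak-type step using a known
    lower bound f_slb <= f* (illustrative sketch, unconstrained case)."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(iters):
        g = np.asarray(subgrad(x), dtype=float)
        gnorm_sq = float(np.dot(g, g))
        if gnorm_sq == 0.0:          # zero subgradient: x is optimal
            break
        # Polyak step with f_slb in place of the unknown f*
        step = (f(x) - f_slb) / gnorm_sq
        x = x - step * g
        if f(x) < best_f:            # track the best iterate seen
            best_f, best_x = f(x), x.copy()
    return best_x, best_f

# Example: f(x) = ||x||_1 has f_slb = 0 = f*, a case noted in the abstract
x_best, f_best = subgradient_descent_lb(
    lambda z: np.abs(z).sum(), np.sign, [3.0, -2.0], 0.0)
```

When f_slb equals f* exactly (as for the loss functions mentioned above), this step reduces to the classical Polyak step size.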
When f(·) is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when x_0 is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees. [ABSTRACT FROM AUTHOR]
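The restarting idea for the smooth case can be sketched as follows: run a standard accelerated (FISTA-style) gradient method and periodically reset its momentum. The fixed restart period used here is a simplification for illustration; the paper's restart rule is tied to its complexity analysis and the growth constant G, and the function name `agm_restart` is an assumption of this sketch.

```python
import numpy as np

def agm_restart(grad, x0, L, inner_iters=50, restarts=5):
    """Accelerated Gradient Method with periodic momentum restarts
    (illustrative sketch). L is a Lipschitz constant of grad."""
    x = np.asarray(x0, dtype=float)
    for _ in range(restarts):
        y, t = x.copy(), 1.0          # restart: reset momentum state
        for _ in range(inner_iters):
            x_new = y - grad(y) / L                    # gradient step
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum
            x, t = x_new, t_new
    return x

# Example: minimize the smooth convex quadratic f(x) = 0.5*(x1^2 + 10*x2^2),
# whose gradient is L-Lipschitz with L = 10
x_final = agm_restart(lambda z: np.array([z[0], 10.0 * z[1]]),
                      [5.0, 5.0], L=10.0)
```

Restarting discards stale momentum, which is what allows such schemes to adapt to favorable geometry (here, growth of the level sets) that the plain accelerated method cannot exploit.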

Details

Language :
English
ISSN :
0025-5610
Volume :
170
Issue :
2
Database :
Complementary Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
130550768
Full Text :
https://doi.org/10.1007/s10107-017-1164-1