
Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming.

Authors :
Gonzaga, Clóvis
Karas, Elizabeth
Source :
Mathematical Programming. Apr 2013, Vol. 138, Issue 1/2, p141-166. 26p. 1 Color Photograph, 1 Black and White Photograph, 1 Chart, 3 Graphs.
Publication Year :
2013

Abstract

We modify the first order algorithm for convex programming described by Nesterov in his book (Introductory lectures on convex optimization. A basic course. Kluwer, Boston, 2004). In his algorithms, Nesterov makes explicit use of a Lipschitz constant L for the function gradient, which is either assumed known (Nesterov, Introductory lectures on convex optimization. A basic course. Kluwer, Boston, 2004) or is estimated by an adaptive procedure (Nesterov). We eliminate the use of L at the cost of an extra imprecise line search, and obtain an algorithm that keeps the optimal complexity properties and also inherits the global convergence properties of the steepest descent method for general continuously differentiable optimization. In addition, we develop an adaptive procedure for estimating a strong convexity constant for the function. Numerical tests on a limited set of toy problems show an improvement in performance compared with Nesterov's original algorithms. [ABSTRACT FROM AUTHOR]
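To make the abstract concrete: the baseline being modified is Nesterov's optimal first-order method, which takes a gradient step of size 1/L from an extrapolated point and therefore requires the Lipschitz constant L of the gradient. Below is a minimal Python sketch of that constant-step scheme, together with a hypothetical variant in which the 1/L step is replaced by an Armijo-type backtracking line search, illustrating how L can be eliminated. The function names, the specific line-search rule, and all parameter choices are illustrative assumptions, not the authors' algorithm from the paper.

import numpy as np

def nesterov_constant_step(grad, x0, L, n_iters=500):
    # Nesterov's scheme with known Lipschitz constant L (book version):
    # a gradient step of size 1/L from the extrapolated point y.
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iters):
        x_new = y - grad(y) / L
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum/extrapolation
        x, t = x_new, t_new
    return x

def nesterov_line_search(f, grad, x0, n_iters=500, alpha0=1.0):
    # Hypothetical variant in the spirit of the paper: the 1/L step is
    # replaced by an Armijo-type backtracking line search, so no knowledge
    # of L is needed. (A sketch only; not the authors' exact rule.)
    x = y = np.asarray(x0, dtype=float)
    t, alpha = 1.0, alpha0
    for _ in range(n_iters):
        g = grad(y)
        alpha = min(2.0 * alpha, alpha0)  # let the trial step grow again
        while f(y - alpha * g) > f(y) - 0.5 * alpha * (g @ g):
            alpha *= 0.5                  # backtrack until sufficient decrease
        x_new = y - alpha * g
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x, grad = A x - b.
A = np.diag([1.0, 10.0, 100.0]); b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
gradf = lambda x: A @ x - b
print(nesterov_line_search(f, gradf, np.zeros(3)))  # should approach A^{-1} b

In the sketch, letting the step size grow again at each iteration mimics, very loosely, the adaptive spirit of the method; the paper's actual line search, complexity analysis, and estimation of the strong convexity constant are considerably more refined.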

Details

Language :
English
ISSN :
0025-5610
Volume :
138
Issue :
1/2
Database :
Academic Search Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
86213428
Full Text :
https://doi.org/10.1007/s10107-012-0541-z