
AN ACCELERATED HYBRID PROXIMAL EXTRAGRADIENT METHOD FOR CONVEX OPTIMIZATION AND ITS IMPLICATIONS TO SECOND-ORDER METHODS.

Authors :
MONTEIRO, RENATO D. C.
SVAITER, B. F.
Source :
SIAM Journal on Optimization. 2013, Vol. 23 Issue 2, p1092-1125. 34p.
Publication Year :
2013

Abstract

This paper presents an accelerated variant of the hybrid proximal extragradient (HPE) method for convex optimization, referred to as the accelerated HPE (A-HPE) framework. Iteration-complexity results are established for the A-HPE framework, as well as a special version of it, where a large stepsize condition is imposed. Two specific implementations of the A-HPE framework are described in the context of a structured convex optimization problem whose objective function consists of the sum of a smooth convex function and an extended real-valued nonsmooth convex function. In the first implementation, a generalization of a variant of Nesterov's method is obtained for the case where the smooth component of the objective function has Lipschitz continuous gradient. In the second implementation, an accelerated Newton proximal extragradient (A-NPE) method is obtained for the case where the smooth component of the objective function has Lipschitz continuous Hessian. It is shown that the A-NPE method has an O(1/k^{7/2}) convergence rate, which improves upon the O(1/k^3) convergence rate bound for another accelerated Newton-type method presented by Nesterov. Finally, while Nesterov's method is based on exact solutions of subproblems with cubic regularization terms, the A-NPE method is based on inexact solutions of subproblems with quadratic regularization terms and hence is potentially more tractable from a computational point of view. [ABSTRACT FROM AUTHOR]
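
The first implementation described in the abstract targets structured problems of the form min_x f(x) + h(x), where f is convex with Lipschitz continuous gradient and h is an extended real-valued closed convex (possibly nonsmooth) function. The sketch below is not the paper's A-HPE or A-NPE method; it is a minimal Python sketch of the classical accelerated (FISTA-style) proximal gradient scheme that, per the abstract, the first implementation generalizes. All data and helper names (grad_f, prox_h, the lasso example) are hypothetical and chosen only for illustration.

    import numpy as np

    def accelerated_prox_grad(grad_f, prox_h, x0, L, iters=100):
        # Accelerated proximal gradient (FISTA-style) sketch for
        # min_x f(x) + h(x), with f convex and L-Lipschitz-smooth and h
        # closed convex. Not the paper's A-HPE/A-NPE method; only the
        # well-known O(1/k^2) scheme the first implementation generalizes.
        x = y = np.asarray(x0, dtype=float)
        t = 1.0
        for _ in range(iters):
            x_next = prox_h(y - grad_f(y) / L, 1.0 / L)          # forward-backward step
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0     # momentum parameter update
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)      # extrapolation step
            x, t = x_next, t_next
        return x

    # Hypothetical usage: f(x) = 0.5*||Ax - b||^2, h(x) = lam*||x||_1 (lasso).
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
        L = np.linalg.norm(A, 2) ** 2                             # Lipschitz constant of grad f
        grad_f = lambda x: A.T @ (A @ x - b)
        prox_h = lambda z, s: np.sign(z) * np.maximum(np.abs(z) - lam * s, 0.0)  # soft-thresholding
        print(accelerated_prox_grad(grad_f, prox_h, np.zeros(100), L, iters=300)[:5])

The A-NPE method of the paper instead assumes a Lipschitz continuous Hessian and reaches the faster O(1/k^{7/2}) rate by solving quadratically regularized subproblems inexactly, rather than the exact cubic-regularized subproblems in Nesterov's accelerated Newton-type method.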

Details

Language :
English
ISSN :
1052-6234
Volume :
23
Issue :
2
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
90022997
Full Text :
https://doi.org/10.1137/110833786