
Non-asymptotic superlinear convergence of standard quasi-Newton methods

Authors :
Qiujiang Jin
Aryan Mokhtari
Source :
Mathematical Programming. 200:425-473
Publication Year :
2022
Publisher :
Springer Science and Business Media LLC, 2022.

Abstract

In this paper, we study and prove the non-asymptotic superlinear convergence rate of the Broyden class of quasi-Newton algorithms, which includes the Davidon–Fletcher–Powell (DFP) method and the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method. The asymptotic superlinear convergence rate of these quasi-Newton methods has been extensively studied in the literature, but their explicit finite-time local convergence rate has not been fully investigated. We provide a finite-time (non-asymptotic) convergence analysis for Broyden quasi-Newton algorithms under the assumptions that the objective function is strongly convex, its gradient is Lipschitz continuous, and its Hessian is Lipschitz continuous at the optimal solution. We show that in a local neighborhood of the optimal solution, the iterates generated by both DFP and BFGS converge to the optimal solution at a superlinear rate of $$(1/k)^{k/2}$$, where k is the number of iterations. We also prove that a similar local superlinear convergence result holds when the objective function is self-concordant. Numerical experiments on several datasets confirm our explicit convergence rate bounds. Our theoretical guarantee is one of the first results to provide a non-asymptotic superlinear convergence rate for quasi-Newton methods.
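To make the rate concrete, the following is a minimal sketch (not the authors' code or experiments) of a standard BFGS loop with unit step size on a strongly convex, gradient-Lipschitz test function, printing the gradient norm next to the $(1/k)^{k/2}$ envelope from the abstract. The test function, starting point, and use of the gradient norm as an error proxy are illustrative assumptions, not details taken from the paper.

```python
# Illustrative BFGS sketch: compare local error decay with the (1/k)^{k/2}
# superlinear rate reported in the paper. Test problem is an assumption.
import numpy as np

def grad(x):
    # Gradient of f(x) = 0.5||x||^2 + sum(log(1 + exp(x_i))):
    # strongly convex with Lipschitz gradient.
    return x + 1.0 / (1.0 + np.exp(-x))

def bfgs(x0, tol=1e-12, max_iter=50):
    n = x0.size
    H = np.eye(n)                 # inverse-Hessian approximation
    x, g = x0.copy(), grad(x0)
    errs = []
    for _ in range(max_iter):
        errs.append(np.linalg.norm(g))
        if errs[-1] < tol:
            break
        x_new = x - H @ g         # quasi-Newton step with unit step size (local phase)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-16:            # standard BFGS update of the inverse approximation
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return np.array(errs)

rng = np.random.default_rng(0)
errs = bfgs(0.1 * rng.standard_normal(20))    # start near the optimum (local regime)
for k in range(1, len(errs)):
    bound = (1.0 / k) ** (k / 2)              # shape of the non-asymptotic rate
    print(f"k={k:2d}  ||grad||={errs[k]:.3e}  (1/k)^(k/2)={bound:.3e}")
```

In this sketch the printed bound only traces the shape of the superlinear rate; the paper's actual guarantee involves problem-dependent constants and holds once the iterate enters a suitable neighborhood of the optimal solution.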

Details

ISSN :
1436-4646 and 0025-5610
Volume :
200
Database :
OpenAIRE
Journal :
Mathematical Programming
Accession number :
edsair.doi.dedup.....57e96d87abb95b91d2c526627431108c