
A family of variable metric proximal methods.

Authors :
Bonnans, J.
Gilbert, J.
Lemaréchal, C.
Sagastizábal, C.
Source :
Mathematical Programming; Jan 1995, Vol. 68, Issue 1-3, p15-47, 33p
Publication Year :
1995

Abstract

We consider conceptual optimization methods combining two ideas: the Moreau-Yosida regularization in convex analysis, and quasi-Newton approximations of smooth functions. We outline several approaches based on this combination, and establish their global convergence. Then we study theoretically the local convergence properties of one of these approaches, which uses quasi-Newton updates of the objective function itself. Also, we obtain a globally and superlinearly convergent BFGS proximal method. At each step of our study, we single out the assumptions that are useful to derive the result concerned.
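To make the combination concrete, here is a minimal sketch (not the paper's algorithm) of a variable metric proximal point iteration. The Moreau-Yosida idea replaces minimizing f directly with repeated proximal steps x+ = argmin_y f(y) + (1/2)(y - x)^T M (y - x), where the metric M would, in a quasi-Newton variant, be updated from curvature information (e.g. by BFGS). For illustration we take a convex quadratic f(x) = 0.5 x^T A x, for which the prox step has the closed form (A + M) x+ = M x, and a fixed metric M = mu * I; the matrices `A`, `M` and the name `prox_step` are assumptions for this sketch.

```python
import numpy as np

def prox_step(A, M, x):
    """One variable metric proximal step for f(x) = 0.5 x^T A x.

    Solves x+ = argmin_y 0.5 y^T A y + 0.5 (y - x)^T M (y - x),
    whose optimality condition is the linear system (A + M) x+ = M x.
    """
    return np.linalg.solve(A + M, M @ x)

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # positive definite Hessian of f
M = 0.5 * np.eye(2)                     # illustrative fixed metric (mu * I)
x = np.array([4.0, -3.0])
for _ in range(50):
    x = prox_step(A, M, x)
print(np.linalg.norm(x))  # shrinks toward 0, the minimizer of f
```

Each step contracts the error by a factor mu / (lambda_i + mu) along the i-th eigendirection of A, which is why a curvature-adapted M (the quasi-Newton ingredient studied in the paper) can accelerate the plain proximal point method.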

Details

Language :
English
ISSN :
0025-5610
Volume :
68
Issue :
1-3
Database :
Complementary Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
71014516
Full Text :
https://doi.org/10.1007/BF01585756