A preconditioning proximal Newton method for nondifferentiable convex optimization
- Source: Scopus-Elsevier
- Publication Year: 1997
Abstract
- We propose a proximal Newton method for solving nondifferentiable convex optimization problems. The method combines the generalized Newton method with Rockafellar's proximal point algorithm. At each step, the proximal point is computed only approximately, and the regularization matrix is preconditioned to overcome the inexactness of this approximation. We show that such preconditioning is possible within a given accuracy and that the second-order differentiability properties of the Moreau-Yosida regularization are invariant under this preconditioning. Based on these results, superlinear convergence is established under a semismoothness condition.
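The abstract builds on the proximal point iteration and the Moreau-Yosida regularization. As a minimal illustrative sketch (not the authors' preconditioned Newton method), the following toy example applies the exact proximal point iteration to the nondifferentiable convex function f(y) = |y - c|, whose proximal operator is soft-thresholding, and evaluates the gradient of its Moreau-Yosida envelope, which is well defined even though f itself is not differentiable. All names and the choice of f are illustrative assumptions.

```python
def prox_abs(x, center, lam):
    """Proximal operator of f(y) = |y - center| with parameter lam:
    soft-thresholding of x toward `center` (illustrative example function)."""
    if x > center + lam:
        return x - lam
    if x < center - lam:
        return x + lam
    return center

def proximal_point(x0, center=2.0, lam=1.0, iters=20):
    """Rockafellar's proximal point iteration x_{k+1} = prox_{lam f}(x_k),
    here with the prox evaluated exactly for the toy f."""
    x = x0
    for _ in range(iters):
        x = prox_abs(x, center, lam)
    return x

def envelope_grad(x, center=2.0, lam=1.0):
    """Gradient of the Moreau-Yosida envelope
    F_lam(x) = min_y f(y) + ||y - x||^2 / (2 lam),
    given by grad F_lam(x) = (x - prox_{lam f}(x)) / lam.
    F_lam is differentiable with Lipschitz gradient even though f is not."""
    return (x - prox_abs(x, center, lam)) / lam
```

The paper's contribution concerns the realistic setting where the proximal point can only be found approximately; the preconditioning of the regularization matrix compensates for that inexactness while preserving the second-order properties of the envelope that drive superlinear convergence.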
- Subjects :
- General Mathematics
- Numerical analysis
- Mathematical analysis
- Mathematics::Optimization and Control
- Computer Science::Numerical Analysis
- Mathematics::Numerical Analysis
- Proximal point
- Newton's method
- Nondifferentiable convex optimization
- Superlinear convergence
- Convex optimization
- Proximal gradient methods for learning
- Proximal Gradient Methods
- Differentiable function
- Invariant (mathematics)
- Software
- Mathematics
Details
- Language: English
- Database: OpenAIRE
- Journal: Scopus-Elsevier
- Accession number: edsair.doi.dedup.....b630ae8b9efb93cfa849946efa1b1607