On the Convergence of Constrained Optimization Methods with Accurate Hessian Information on a Subspace
- Source :
- SIAM Journal on Numerical Analysis. 27:141-153
- Publication Year :
- 1990
- Publisher :
- Society for Industrial & Applied Mathematics (SIAM), 1990.
-
Abstract
- This paper analyzes methods of the type proposed by Coleman and Conn for nonlinearly constrained optimization. It is shown that if the reduced Hessian approximation is sufficiently accurate, then the method generates a sequence of iterates that converges one-step superlinearly. This result applies to a quasi-Newton implementation. If the exact reduced Hessian is used, the method has an R-order equal to that of the secant method. A similar result for a modified version of successive quadratic programming is also proved. Finally, some parallels are pointed out between convergence results for methods that approximate the reduced Hessian and for multiplier methods that use the reduced Hessian inverse.
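The abstract states that the exact-reduced-Hessian variant attains the R-order of the secant method. For a scalar root-finding problem that order is the golden ratio, (1 + √5)/2 ≈ 1.618. A minimal sketch of the classical one-dimensional secant iteration (an illustration of the rate being referenced, not of the constrained method analyzed in the paper):

```python
# Secant method for a scalar root-finding problem f(x) = 0.
# Each step replaces the derivative in Newton's method with a
# finite-difference slope through the last two iterates; its R-order
# of convergence is the golden ratio, (1 + 5**0.5) / 2 ≈ 1.618.

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Return an approximate root of f using secant updates."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1 - f0) < 1e-30:  # guard against a vanishing slope
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Example: root of x^2 - 2, i.e. sqrt(2)
root = secant(lambda x: x * x - 2.0, 1.0, 2.0)
```

The same superlinear rate is what quasi-Newton (secant-type) Hessian approximations aim to recover in higher dimensions, which is the setting of the paper.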
- Subjects :
- Hessian matrix
Hessian equation
Hessian automatic differentiation
Second partial derivative test
Davidon–Fletcher–Powell formula
Quasi-Newton method
Secant method
Quadratic programming
Mathematical optimization
Numerical Analysis
Applied Mathematics
Computational Mathematics
Mathematics
Details
- ISSN :
- 0036-1429 (print) and 1095-7170 (online)
- Volume :
- 27
- Database :
- OpenAIRE
- Journal :
- SIAM Journal on Numerical Analysis
- Accession number :
- edsair.doi...........2382fdc6a2db65688c541c5abbea220e
- Full Text :
- https://doi.org/10.1137/0727009