A class of accelerated conjugate-gradient-like methods based on a modified secant equation
- Author
- Yigui Ou and Haichan Lin
- Subjects
- Control and Optimization, Line search, Scale (ratio), Computer science, Applied Mathematics, Strategy and Management, Computation, Mathematics of Computing (Numerical Analysis), Atomic and Molecular Physics, and Optics, Convexity, Conjugate gradient method, Convergence, Applied mathematics, Business and International Management, Electrical and Electronic Engineering, Convex function, Descent (mathematics)
- Abstract
This paper proposes a new class of accelerated conjugate-gradient-like algorithms for solving large-scale unconstrained optimization problems, combining the accelerated adaptive Perry conjugate gradient algorithms of Andrei (2017) with a modified secant condition and a nonmonotone line search technique. An attractive property of the proposed methods is that the search direction always satisfies the sufficient descent condition, independently of the line search used and of the convexity of the objective function. Under common assumptions, the proposed methods are proven to be globally convergent for nonconvex smooth functions and R-linearly convergent for uniformly convex functions. Numerical experiments demonstrate the efficiency of the proposed methods in practical computations.
- Published
- 2020
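As background to the abstract, the two generic ingredients it names (a conjugate-gradient-like search direction and a nonmonotone line search) can be sketched as follows. This is a minimal illustration only, assuming a standard Polak-Ribiere+ direction and a Grippo-style max-based nonmonotone Armijo rule; it is not the authors' Perry-type update or their modified secant condition.

```python
import numpy as np

def nonmonotone_cg(f, grad, x0, max_iter=500, M=5, sigma=1e-4, tol=1e-6):
    """Generic nonlinear CG (Polak-Ribiere+ beta) with a max-based
    nonmonotone Armijo backtracking line search.  Illustrative sketch;
    parameter names (M, sigma) are this example's, not the paper's."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first step: steepest descent
    f_hist = [f(x)]              # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Nonmonotone Armijo: compare against the max of the last M values,
        # so occasional increases in f are tolerated.
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while f(x + alpha * d) > f_ref + sigma * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:    # line search failed; accept tiny step
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ conjugacy parameter, truncated at zero
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not descent
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

For example, on the strictly convex quadratic `f(x) = 0.5 x'Ax - b'x` with `A = [[3, 1], [1, 2]]` and `b = [1, 1]`, the iteration converges to the unique minimizer `A^{-1}b = [0.2, 0.4]`. The sufficient-descent guarantee highlighted in the abstract is what the simple restart safeguard above only approximates: the proposed methods build it into the direction itself, independently of the line search.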