1. A new modification of conjugate gradient method with global convergence properties
- Author
Ahmad Alhawarat, Mohd Rivaie, Mahmoud Dawahdeh, and Mustafa Mamat
- Subjects
Set (abstract data type), Line search, Computer science, Property (programming), Conjugate gradient method, Convergence (routing), MathematicsofComputing_NUMERICALANALYSIS, Applied mathematics, Unconstrained optimization, Descent (mathematics)
- Abstract
The conjugate gradient method is one of the best-known techniques in unconstrained optimization, used to compute optimal solutions of unconstrained optimization problems. It is widely applied in many fields, especially computer science and engineering, owing to its convergence speed, simplicity, and low memory requirements. This paper presents a new modified conjugate gradient method. Under the strong Wolfe-Powell (SWP) line search, the method is proved to possess the sufficient descent property and to be globally convergent. Numerical results on a set of 141 test problems show that the proposed formula is more efficient than, and outperforms, other methods using the same line search.
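The conjugate gradient iteration described above can be sketched as follows. This is a minimal, generic illustration (not the paper's new formula): it uses the classical Fletcher-Reeves coefficient on a 2-D convex quadratic f(x) = 0.5 xᵀAx − bᵀx, where the exact step length is available in closed form and also satisfies the strong Wolfe-Powell conditions, so no iterative line search is needed. The matrix `A`, vector `b`, and all helper names are illustrative assumptions.

```python
# Sketch of the conjugate gradient method with the Fletcher-Reeves
# update, applied to a 2-D convex quadratic f(x) = 0.5 x^T A x - b^T x.
# For quadratics the exact step length satisfies the strong Wolfe-Powell
# conditions, so the closed-form step stands in for the line search.

A = [[4.0, 1.0],
     [1.0, 3.0]]   # symmetric positive definite (illustrative choice)
b = [1.0, 2.0]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def grad(x):
    # gradient of the quadratic: A x - b
    return [gi - bi for gi, bi in zip(matvec(A, x), b)]

def cg_minimize(x, tol=1e-10, max_iter=50):
    g = grad(x)
    d = [-gi for gi in g]                     # start along steepest descent
    for _ in range(max_iter):
        if dot(g, g) < tol ** 2:
            break
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)       # exact step for quadratics
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)  # Fletcher-Reeves coefficient
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

x_star = cg_minimize([0.0, 0.0])
```

For an n-dimensional quadratic, exact-arithmetic CG terminates in at most n iterations; the modified formulas studied in papers like this one target general nonlinear objectives, where the SWP line search replaces the closed-form step.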
- Published
- 2018