Analysis of a Two-Step Gradient Method with Two Momentum Parameters for Strongly Convex Unconstrained Optimization.

Authors :
Krivovichev, Gerasim V.
Sergeeva, Valentina Yu.
Source :
Algorithms. Mar 2024, Vol. 17, Issue 3, p. 126. 21 p.
Publication Year :
2024

Abstract

The paper is devoted to the theoretical and numerical analysis of a two-step method, constructed as a modification of Polyak's heavy ball method with the inclusion of an additional momentum parameter. For the quadratic case, convergence conditions are obtained with the use of the first Lyapunov method. For the non-quadratic case of sufficiently smooth strongly convex functions, conditions guaranteeing local convergence are obtained. An approach to finding optimal parameter values, based on the solution of a constrained optimization problem, is proposed. The effect of the additional parameter on the convergence rate is analyzed. With the use of an ordinary differential equation equivalent to the method, the damping effect of this parameter on the oscillations typical of the non-monotonic convergence of the heavy ball method is demonstrated. In numerical examples for non-quadratic convex and non-convex test functions and machine learning problems (regularized smoothed elastic net regression, logistic regression, and recurrent neural network training), the positive influence of the additional parameter on the convergence process is demonstrated. [ABSTRACT FROM AUTHOR]
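A natural reading of the abstract is an update that augments the heavy ball iteration with a second momentum term over one more past iterate. The sketch below is an illustrative guess at such a two-step, two-momentum scheme, not the paper's exact method; the names `two_momentum_gd`, the parameters `alpha`, `beta1`, `beta2`, and the specific update form are assumptions for demonstration on a strongly convex quadratic.

```python
import numpy as np

def two_momentum_gd(grad, x0, alpha, beta1, beta2, iters=500):
    """Hypothetical two-step method: a gradient step plus two momentum
    terms built from the last two displacements. The paper's actual
    update rule may differ; this only illustrates the general idea."""
    x_prev2 = x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_new = (x - alpha * grad(x)
                 + beta1 * (x - x_prev)        # classical heavy-ball momentum
                 + beta2 * (x_prev - x_prev2)) # additional momentum parameter
        x_prev2, x_prev, x = x_prev, x, x_new
    return x

# Strongly convex quadratic f(x) = 0.5 * x^T A x, minimizer at the origin.
A = np.array([[4.0, 0.0],
              [0.0, 1.0]])
grad = lambda x: A @ x
x_star = two_momentum_gd(grad, [5.0, 5.0], alpha=0.2, beta1=0.3, beta2=0.05)
```

With these (hand-picked, not optimized) parameter values the iterates contract toward the minimizer; tuning `beta2` is where the abstract's damping effect on heavy-ball oscillations would show up.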

Details

Language :
English
ISSN :
19994893
Volume :
17
Issue :
3
Database :
Academic Search Index
Journal :
Algorithms
Publication Type :
Academic Journal
Accession number :
176272528
Full Text :
https://doi.org/10.3390/a17030126