
On a Family of Relaxed Gradient Descent Methods for Quadratic Minimization

Authors:
MacDonald, Liam
Murray, Rua
Tappenden, Rachael
Publication Year:
2024

Abstract

This paper studies the convergence properties of a family of Relaxed $\ell$-Minimal Gradient Descent methods for quadratic optimization; the family includes the omnipresent Steepest Descent method, as well as the Minimal Gradient method. Simple proofs are provided that show, in an appropriately chosen norm, the gradient and the distance of the iterates from optimality converge linearly, for all members of the family. Moreover, the function values decrease linearly, and iteration complexity results are provided. All theoretical results hold when (fixed) relaxation is employed. It is also shown that, given a fixed overhead and storage budget, every Relaxed $\ell$-Minimal Gradient Descent method can be implemented using exactly one matrix vector product. Numerical experiments are presented that illustrate the benefits of relaxation across the family.

Comment: 23 pages, 6 figures, 2 tables
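To make the setting concrete, here is a minimal sketch of a relaxed gradient descent iteration for the quadratic $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$. The step-size parameterization $\alpha_k = (g_k^\top A^{\ell} g_k)/(g_k^\top A^{\ell+1} g_k)$ is an assumed form for illustration, chosen so that $\ell = 0$ recovers Steepest Descent and $\ell = 1$ the Minimal Gradient method; this naive version uses $\ell + 1$ matrix-vector products per iteration, whereas the paper shows that one product suffices given extra storage.

```python
import numpy as np

def relaxed_l_minimal_gd(A, b, x0, ell=0, theta=1.0, tol=1e-10, max_iter=1000):
    """Relaxed gradient descent for f(x) = 0.5 x^T A x - b^T x, A SPD.

    Assumed step size: alpha_k = (g^T A^ell g) / (g^T A^(ell+1) g),
    so ell=0 gives Steepest Descent and ell=1 the Minimal Gradient method.
    theta is a fixed relaxation factor applied to every step.
    """
    x = x0.astype(float).copy()
    for k in range(max_iter):
        g = A @ x - b                    # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        num_vec, den_vec = g, A @ g      # A^ell g and A^(ell+1) g for ell=0
        for _ in range(ell):             # extra matvecs for larger ell
            num_vec, den_vec = den_vec, A @ den_vec
        alpha = (g @ num_vec) / (g @ den_vec)
        x -= theta * alpha * g           # relaxed step
    return x, k
```

For an SPD system this converges linearly to $x^* = A^{-1}b$; varying `theta` below or above 1 mimics the under- and over-relaxation explored in the paper's experiments.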

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2404.19255
Document Type:
Working Paper