1. Inexact Riemannian Gradient Descent Method for Nonconvex Optimization
- Author
- Juan Zhou, Kangkang Deng, Hongxia Wang, and Zheng Peng
- Subjects
- Mathematics - Optimization and Control; 65K05, 65K10, 90C05, 90C26, 90C30
- Abstract
Gradient descent methods are fundamental first-order optimization algorithms in both Euclidean spaces and on Riemannian manifolds. However, in many scenarios the exact gradient is not readily available. This paper proposes a novel inexact Riemannian gradient descent algorithm for nonconvex problems, accompanied by a convergence guarantee. In particular, we establish two inexact gradient conditions on Riemannian manifolds for the first time, enabling precise gradient approximations. Our method demonstrates strong convergence results for both gradient sequences and function values. Global convergence, with constructive convergence rates for the sequence of iterates, is ensured under the Riemannian Kurdyka-Łojasiewicz property. Furthermore, our algorithm encompasses two specific applications: Riemannian sharpness-aware minimization and the Riemannian extragradient algorithm, both of which inherit the global convergence properties of the inexact gradient method. Numerical experiments on low-rank matrix completion and principal component analysis problems validate the efficiency and practical relevance of the proposed approaches.
- Comment
- arXiv admin note: text overlap with arXiv:2401.08060 by other authors
- Published
- 2024
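The inexact Riemannian gradient descent described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes a single relative-error inexactness condition, ‖g_k − grad f(x_k)‖ ≤ θ‖grad f(x_k)‖ (one common form of inexact gradient; the paper's two conditions are not reproduced here), and runs on the unit sphere for a leading-eigenvector (PCA-style) problem. All parameter values and function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric matrix whose leading eigenvector we seek:
# minimize f(x) = -x^T A x over the unit sphere (a PCA-style instance).
n = 20
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

def riem_grad(x):
    # Riemannian gradient on the sphere: project the Euclidean
    # gradient of f (here -2 A x) onto the tangent space at x.
    g = -2 * A @ x
    return g - (x @ g) * x

def retract(x, v):
    # Retraction on the sphere: step in the tangent direction, renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def inexact_rgd(x0, step=0.05, theta=0.1, iters=500):
    # Gradient descent with an inexact Riemannian gradient. The exact
    # gradient is perturbed by tangent noise scaled so that
    # ||g_k - grad f(x_k)|| <= theta * ||grad f(x_k)||  (assumed condition).
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = riem_grad(x)
        noise = rng.standard_normal(x.size)
        noise -= (x @ noise) * x            # keep the perturbation tangent
        nn = np.linalg.norm(noise)
        if nn > 0:
            noise *= theta * np.linalg.norm(g) / nn
        x = retract(x, -step * (g + noise))
    return x

x = inexact_rgd(rng.standard_normal(n))
lead = np.linalg.eigh(A)[1][:, -1]          # true leading eigenvector
align = abs(x @ lead)                       # approaches 1 on success
```

Because the perturbation is proportional to the current gradient norm, the error vanishes as the iterates approach a stationary point, which is the intuition behind relative-error inexact gradient conditions.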