
Nesterov Acceleration for Riemannian Optimization

Authors:
Kim, Jungbin
Yang, Insoon
Publication Year: 2022

Abstract

In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve Riemannian optimization problems in a computationally tractable manner. The iteration complexity of our algorithm matches that of the NAG method on the Euclidean space when the objective functions are geodesically convex or geodesically strongly convex. To the best of our knowledge, the proposed algorithm is the first fully accelerated method for geodesically convex optimization problems without requiring strong convexity. Our convergence rate analysis exploits novel metric distortion lemmas as well as carefully designed potential functions. We also identify a connection with the continuous-time dynamics for modeling Riemannian acceleration in Alimisis et al. [1], which lets us interpret the accelerated convergence of our scheme through the lens of continuous-time flows.
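For readers unfamiliar with the Euclidean baseline the abstract refers to, below is a minimal sketch of the classical NAG iteration for smooth convex functions. This is only the Euclidean reference scheme, not the paper's Riemannian algorithm, which replaces the linear extrapolation step with manifold operations such as exponential and logarithm maps. All function names and parameters here are illustrative, not taken from the paper.

```python
import numpy as np

def nag(grad_f, x0, step_size, num_iters):
    """Euclidean Nesterov accelerated gradient (NAG), shown as a baseline.

    x_{k+1} = y_k - s * grad_f(y_k)                     (gradient step)
    y_{k+1} = x_{k+1} + k/(k+3) * (x_{k+1} - x_k)       (momentum step)
    """
    x = x0.copy()
    y = x0.copy()
    for k in range(num_iters):
        x_next = y - step_size * grad_f(y)            # gradient step from the extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)     # momentum/extrapolation step
        x = x_next
    return x

# Example: minimize the convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
grad_f = lambda x: A @ x
x_min = nag(grad_f, np.array([5.0, 5.0]), step_size=1.0 / 10.0, num_iters=200)
```

With step size 1/L (here L = 10, the largest eigenvalue of A), this scheme attains the O(1/k^2) rate for convex objectives; the paper's contribution is matching this rate in the geodesically convex Riemannian setting.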

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2202.02036
Document Type: Working Paper