Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization
- Publication Year : 2020
Abstract
- We propose a stochastic variance-reduced cubic regularized Newton algorithm for optimizing finite-sum problems over a Riemannian submanifold of Euclidean space. The algorithm computes a full gradient and Hessian at the beginning of each epoch and performs stochastic variance-reduced updates in the iterations within each epoch. When the manifold is embedded in Euclidean space, an iteration complexity of $O(\epsilon^{-3/2})$ is established for reaching an $(\epsilon,\sqrt{\epsilon})$-second-order stationary point, i.e., a point at which the Riemannian gradient norm is at most $\epsilon$ and the minimum eigenvalue of the Riemannian Hessian is at least $-\sqrt{\epsilon}$. Furthermore, the paper proposes a computationally more appealing modification of the algorithm that requires only an inexact solution of the cubic regularized Newton subproblem while retaining the same iteration complexity. The proposed algorithm is evaluated and compared with three other Riemannian second-order methods in two numerical studies: estimating the inverse scale matrix of the multivariate t-distribution on the manifold of symmetric positive definite matrices, and estimating the parameter of a linear classifier on the Sphere manifold.
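- As a hedged illustration of the setting summarized above (the symbols $\mathcal{M}$, $f_i$, $n$, $\operatorname{grad}$, and $\operatorname{Hess}$ are assumed notation for this sketch, not taken from the record itself), the finite-sum problem can be written as
  $$\min_{x \in \mathcal{M}} \; f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x),$$
  and a point $x \in \mathcal{M}$ is an $(\epsilon,\sqrt{\epsilon})$-second-order stationary point if
  $$\|\operatorname{grad} f(x)\| \le \epsilon \quad \text{and} \quad \lambda_{\min}\!\bigl(\operatorname{Hess} f(x)\bigr) \ge -\sqrt{\epsilon}.$$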
- Subjects : Mathematics - Optimization and Control
Details
- Database : arXiv
- Publication Type : Report
- Accession number : edsarx.2010.03785
- Document Type : Working Paper
- Full Text : https://doi.org/10.1007/s10957-022-02137-5