Gradient Methods for Stochastic Optimization in Relative Scale
- Publication Year: 2023
Abstract
- We propose a new concept of a relatively inexact stochastic subgradient and present novel first-order methods that can use such objects to approximately solve convex optimization problems in relative scale. An important example where relatively inexact subgradients naturally arise is given by the Power or Lanczos algorithms for computing an approximate leading eigenvector of a symmetric positive semidefinite matrix. Using these algorithms as subroutines in our methods, we get new optimization schemes that can provably solve certain large-scale Semidefinite Programming problems with relative accuracy guarantees by using only matrix-vector products.
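- The approximate-eigenvector subroutine mentioned in the abstract is easy to illustrate. Below is a minimal sketch of the power method for a symmetric positive semidefinite matrix accessed only through matrix-vector products; the function name `power_method` and its parameters are illustrative, not taken from the paper, and a fixed iteration count stands in for a proper relative-accuracy stopping rule.

```python
import numpy as np

def power_method(matvec, n, num_iters=100, seed=None):
    """Approximate the leading eigenpair of a symmetric PSD matrix.

    matvec: callable computing A @ v for the (possibly implicit) matrix A.
    n: dimension of A.
    Returns (eigenvalue estimate, unit-norm eigenvector estimate).
    """
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = matvec(v)            # only access to A is via matrix-vector products
        norm = np.linalg.norm(w)
        if norm == 0.0:          # A v = 0: v lies in the null space
            return 0.0, v
        v = w / norm
    # Rayleigh quotient gives the eigenvalue estimate for the final iterate
    return float(v @ matvec(v)), v

# Usage with an explicit matrix, passed only through its matvec:
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
lam, v = power_method(lambda x: A @ x, n=2)
```

For PSD matrices the power method converges to the leading eigenvalue with a multiplicative error, which is the kind of relative-scale guarantee the paper's framework is built to exploit; the Lanczos algorithm achieves the same with fewer matrix-vector products.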
- Subjects: Mathematics - Optimization and Control
Details
- Database: arXiv
- Publication Type: Report
- Accession number: edsarx.2301.08352
- Document Type: Working Paper