Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization
- Publication Year :
- 2022
- Publisher :
- HAL CCSD, 2022.
Abstract
- We consider the problem of minimizing the sum of two convex functions. One of those functions has Lipschitz-continuous gradients, and can be accessed via stochastic oracles, whereas the other is "simple". We provide a Bregman-type algorithm with accelerated convergence in function values to a ball containing the minimum. The radius of this ball depends on problem-dependent constants, including the variance of the stochastic oracle. We further show that this algorithmic setup naturally leads to a variant of Frank-Wolfe achieving acceleration under parallelization. More precisely, when minimizing a smooth convex function on a bounded domain, we show that one can achieve an $\epsilon$ primal-dual gap (in expectation) in $\tilde{O}(1/ \sqrt{\epsilon})$ iterations, by only accessing gradients of the original function and a linear maximization oracle with $O(1/\sqrt{\epsilon})$ computing units in parallel. We illustrate this fast convergence on synthetic numerical experiments.
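The abstract's parallel accelerated variant builds on the classical Frank-Wolfe (conditional gradient) method, which only touches the objective through gradients and a linear maximization/minimization oracle (LMO). The sketch below shows that baseline method, not the paper's accelerated algorithm; the domain (probability simplex), the least-squares objective, and the step-size rule are illustrative assumptions.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle for the probability simplex:
    argmin_{s in simplex} <grad, s> is the vertex e_i with i = argmin_i grad_i."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(grad_f, x0, lmo, n_iters=200):
    """Classical Frank-Wolfe with the standard open-loop step size 2/(t+2)."""
    x = x0.copy()
    for t in range(n_iters):
        g = grad_f(x)
        s = lmo(g)                # call the linear oracle on the gradient
        gamma = 2.0 / (t + 2.0)   # step size; keeps x a convex combination
        x = (1 - gamma) * x + gamma * s
    return x

# Illustrative instance: least squares over the simplex, min_x ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
b = A @ x_true                       # b lies in the range of A, so the optimum is 0
grad = lambda x: 2 * A.T @ (A @ x - b)
x = frank_wolfe(grad, np.ones(5) / 5, lmo_simplex)
```

Each iterate is a convex combination of simplex vertices, so feasibility holds for free; the classical method attains an $O(1/t)$ primal gap, which is the baseline the paper improves to $\tilde{O}(1/\sqrt{\epsilon})$ iterations using parallel oracle calls.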
- Subjects :
- FOS: Computer and information sciences
- Machine Learning (cs.LG)
- Computer Science - Machine Learning
- [INFO.INFO-LG] Computer Science [cs]/Machine Learning [cs.LG]
- FOS: Mathematics
- Optimization and Control (math.OC)
- Mathematics - Optimization and Control
- [MATH.MATH-OC] Mathematics [math]/Optimization and Control [math.OC]
Details
- Language :
- English
- Database :
- OpenAIRE
- Accession number :
- edsair.doi.dedup.....cf002a8ed7859adb3ce1d1343ba975f7