Finito: A faster, permutable incremental gradient method for big data problems
- Source :
- 31st International Conference on Machine Learning, ICML 2014
- Publication Year :
- 2014
Abstract
- Recent advances in optimization theory have shown that smooth strongly convex finite sums can be minimized faster than by treating them as a black-box "batch" problem. In this work we introduce a new method in this class with a theoretical convergence rate four times faster than existing methods, for sums with sufficiently many terms. This method is also amenable to a sampling-without-replacement scheme that in practice gives further speed-ups. We give empirical results showing state-of-the-art performance.
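
The abstract only names the method class, so the sketch below is a rough illustration of a generic Finito/MISO-style incremental gradient loop on a small ridge-regularized least-squares sum, with indices visited by a fresh random permutation each pass (sampling without replacement). The exact update form, the step constant `alpha`, and the helper names (`finito_style`, `grad_i`) are illustrative assumptions and should not be read as the paper's verbatim algorithm or tuned constants.

```python
import numpy as np


def grad_i(A, b, mu, w, i):
    """Gradient of the i-th term f_i(w) = 0.5*(a_i.w - b_i)^2 + 0.5*mu*||w||^2."""
    return (A[i] @ w - b[i]) * A[i] + mu * w


def finito_style(A, b, mu, alpha, epochs=300, seed=0):
    """Illustrative Finito-style incremental gradient loop (assumed form, not the paper's).

    Maintains a table of per-term points phi_i and their stored gradients; each
    step forms a new point from the table averages and overwrites one table
    entry. Indices are visited via a fresh random permutation every pass,
    i.e. sampling without replacement.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    phi = np.zeros((n, d))                                   # table of past iterates
    grads = np.stack([grad_i(A, b, mu, phi[i], i) for i in range(n)])
    phi_mean = phi.mean(axis=0)
    grad_sum = grads.sum(axis=0)
    for _ in range(epochs):
        for j in rng.permutation(n):                         # without-replacement order
            w = phi_mean - grad_sum / (alpha * n)            # new candidate point
            # Incrementally refresh the running averages for table entry j.
            phi_mean += (w - phi[j]) / n
            new_grad = grad_i(A, b, mu, w, j)
            grad_sum += new_grad - grads[j]
            phi[j] = w
            grads[j] = new_grad
    return phi_mean - grad_sum / (alpha * n)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d, mu = 50, 5, 1.0
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)
    # Conservative step constant: the largest per-term smoothness constant
    # (an assumed, safe choice; the paper analyses a more aggressive step).
    alpha = float(np.max(np.sum(A * A, axis=1))) + mu
    w_hat = finito_style(A, b, mu, alpha)
    # Closed-form ridge solution of (1/n)*sum_i f_i for comparison.
    w_star = np.linalg.solve(A.T @ A / n + mu * np.eye(d), A.T @ b / n)
    print("distance to closed-form solution:", np.linalg.norm(w_hat - w_star))
```

The running mean of the table iterates and the running gradient sum are updated in O(d) per step, so each pass over the data costs roughly the same as one full-batch gradient while updating the iterate n times.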
Details
- Database :
- OAIster
- Journal :
- 31st International Conference on Machine Learning, ICML 2014
- Notes :
- Beijing, China
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1291783307
- Document Type :
- Electronic Resource