The Stochastic Fejér-Monotone Hybrid Steepest Descent Method and the Hierarchical RLS.
- Source :
- IEEE Transactions on Signal Processing. 6/1/2019, Vol. 67 Issue 11, p2868-2883. 16p.
- Publication Year :
- 2019
Abstract
- This paper introduces the stochastic Fejér-monotone hybrid steepest descent method (S-FM-HSDM) to solve affinely constrained and composite convex minimization tasks. The minimization task is not known exactly; noise contaminates the information about the composite loss function and the affine constraints. S-FM-HSDM generates sequences of random variables that, under certain conditions and with respect to a probability space, converge point-wise to solutions of the noiseless minimization task. S-FM-HSDM enjoys desirable attributes of optimization techniques such as splitting of variables and a constant step size (learning rate). Furthermore, it provides a novel way of exploiting the information about the affine constraints via fixed-point sets of appropriate nonexpansive mappings. Among the offspring of S-FM-HSDM, the hierarchical recursive least squares (HRLS) takes advantage of S-FM-HSDM's versatility toward affine constraints and offers a novel twist to LS by generating sequences of estimates that converge to solutions of a hierarchical optimization task: minimize a convex loss over the set of minimizers of the ensemble LS loss. Numerical tests on a sparsity-aware LS task show that HRLS compares favorably to several state-of-the-art convex, as well as non-convex, stochastic-approximation, and online-learning counterparts. [ABSTRACT FROM AUTHOR]
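To illustrate the hierarchical optimization idea behind HRLS (not the paper's S-FM-HSDM itself), the sketch below runs the classical hybrid steepest descent iteration x_{k+1} = T(x_k) − λ_k ∇φ(T(x_k)), where T is a nonexpansive map whose fixed-point set is the inner LS solution set and φ is the outer convex loss. All details here are illustrative assumptions: a diminishing step size is used (the paper's contribution includes handling noise with a constant step), and the outer loss is taken as φ(x) = ½‖x‖², so the hierarchical optimum is the minimum-norm LS solution, which has the known closed form pinv(A) @ b and lets the result be checked.

```python
import numpy as np

# Illustrative sketch of hierarchical optimization via classical hybrid
# steepest descent (NOT the paper's S-FM-HSDM): minimize an outer convex
# loss over the set of minimizers of an inner least-squares loss.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))   # underdetermined: the LS loss has many minimizers
b = rng.standard_normal(3)
A_pinv = np.linalg.pinv(A)

def T(x):
    # Projection onto the affine set {x : A x = b} (consistent here): a
    # nonexpansive map whose fixed points are exactly the inner-LS minimizers.
    return x - A_pinv @ (A @ x - b)

x = rng.standard_normal(6)
for k in range(200_000):
    lam = 1.0 / (k + 2)           # diminishing steps: lam_k -> 0, sum lam_k = inf
    y = T(x)
    x = y - lam * y               # gradient of the outer loss 0.5*||y||^2 is y

# Minimum-norm LS solution: the hierarchical optimum for this choice of outer loss.
x_star = A_pinv @ b
print(np.linalg.norm(x - x_star))  # small residual to the hierarchical optimum
```

The diminishing step drives the iterates toward the fixed-point set of T while the outer gradient term steers them, within that set, toward the minimizer of the outer loss.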
Details
- Language :
- English
- ISSN :
- 1053-587X
- Volume :
- 67
- Issue :
- 11
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Signal Processing
- Publication Type :
- Academic Journal
- Accession number :
- 137234218
- Full Text :
- https://doi.org/10.1109/TSP.2019.2907257