A Stochastic Variance Reduced Primal Dual Fixed Point Method for Linearly Constrained Separable Optimization.

Authors :
Ya-Nan Zhu
Xiaoqun Zhang
Source :
SIAM Journal on Imaging Sciences; 2021, Vol. 14 Issue 3, p1326-1353, 28p
Publication Year :
2021

Abstract

In this paper we combine the stochastic variance reduced gradient (SVRG) method [R. Johnson and T. Zhang, in Advances in Neural Information Processing Systems 26, 2013, pp. 315-323] with the primal dual fixed point method (PDFP) proposed in [P. Chen, J. Huang, and X. Zhang, Inverse Problems, 29 (2013)] to minimize a sum of two convex functions, one of which is linearly composite. Problems of this type typically arise in sparse signal and image reconstruction. The proposed SVRG-PDFP can be seen as a generalization of Prox-SVRG [L. Xiao and T. Zhang, SIAM J. Optim., 24 (2014), pp. 2057-2075], originally designed for the minimization of a sum of two convex functions. Based on some standard assumptions, we propose two variants, one for strongly convex objective functions and the other for the general convex case. Convergence analysis shows that the convergence rate of SVRG-PDFP is O(1/k) (here k is the iteration number) for the general convex objective function and linear for the strongly convex case. Numerical examples on machine learning and computerized tomography image reconstruction are provided to show the effectiveness of the algorithms.
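For context on the variance-reduction building block the abstract refers to, the following is a minimal sketch of the plain SVRG iteration of Johnson and Zhang (snapshot full gradient plus corrected stochastic steps), applied to an illustrative least-squares problem. The problem instance, step size, and epoch count are assumptions for demonstration; this is not the paper's SVRG-PDFP algorithm, which additionally handles a linearly composite term via PDFP updates.

```python
import numpy as np

# Minimal SVRG sketch (Johnson & Zhang, 2013) on an illustrative problem:
#   minimize (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
# Data, step size, and epoch count below are assumptions, not from the paper.

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true                      # noiseless, so x_true is the minimizer

def grad_i(x, i):
    # Gradient of the i-th component function f_i(x) = 0.5*(a_i^T x - b_i)^2.
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    # Full gradient (1/n) * sum_i grad_i(x).
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
eta = 0.05                          # step size (assumed)
for epoch in range(30):
    x_snap = x.copy()
    mu = full_grad(x_snap)          # full gradient at the snapshot point
    for _ in range(n):              # inner loop of variance-reduced steps
        i = rng.integers(n)
        # Variance-reduced gradient estimate: unbiased, with variance
        # shrinking as x and x_snap approach the minimizer.
        v = grad_i(x, i) - grad_i(x_snap, i) + mu
        x = x - eta * v

print(np.linalg.norm(x - x_true))
```

The corrected direction `v` has the same expectation as the true gradient but much lower variance near the solution, which is what allows a constant step size and the linear rate in the strongly convex case.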

Details

Language :
English
ISSN :
1936-4954
Volume :
14
Issue :
3
Database :
Complementary Index
Journal :
SIAM Journal on Imaging Sciences
Publication Type :
Academic Journal
Accession number :
152945647
Full Text :
https://doi.org/10.1137/20M1354398