
A Non-Euclidean Gradient Descent Framework for Non-Convex Matrix Factorization.

Authors :
Hsieh, Ya-Ping
Kao, Yu-Chun
Mahabadi, Rabeeh Karimi
Yurtsever, Alp
Kyrillidis, Anastasios
Cevher, Volkan
Source :
IEEE Transactions on Signal Processing. 11/15/2018, Vol. 66 Issue 22, p5917-5926. 10p.
Publication Year :
2018

Abstract

We study convex optimization problems that feature low-rank matrix solutions. In such scenarios, non-convex methods offer significant advantages over convex methods due to their lower space complexity, as well as faster convergence in practice. Under mild assumptions, these methods feature global convergence guarantees. In this paper, we extend the results on this matter by following a different path. We derive a non-Euclidean optimization framework in the non-convex setting that takes nonlinear gradient steps on the factors. Our framework makes it possible to further exploit the underlying problem structures, such as sparsity or low-rankness in the factorized domain, or better dimensional dependence of the smoothness parameters of the objectives. We prove that the non-Euclidean methods enjoy the same rigorous guarantees as their Euclidean counterparts under appropriate assumptions. Numerical evidence with Fourier ptychography and FastText applications, using real data, shows that our approach can enhance solution quality, as well as convergence speed, over standard non-convex approaches. [ABSTRACT FROM AUTHOR]
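To illustrate the contrast between Euclidean and non-Euclidean gradient steps on the factors, the sketch below compares a standard factored gradient step for the PSD low-rank objective f(U) = ||UUᵀ − M||²_F with a mirror-descent step using the entry-wise exponential mirror map (exponentiated gradient). The exponential geometry is an illustrative assumption for this sketch, not the specific geometry used in the paper, and it requires entry-wise positive factors.

```python
import numpy as np

def loss(U, M):
    """Objective f(U) = ||U U^T - M||_F^2 for PSD low-rank approximation."""
    return np.linalg.norm(U @ U.T - M, "fro") ** 2

def grad(U, M):
    """Euclidean gradient of f with respect to the factor U."""
    return 4 * (U @ U.T - M) @ U

def euclidean_step(U, M, eta):
    """Standard factored gradient step: U <- U - eta * grad f(U)."""
    return U - eta * grad(U, M)

def exp_gradient_step(U, M, eta):
    """A non-Euclidean (mirror-descent) step with the entry-wise
    exponential mirror map, valid for entry-wise positive factors.
    Illustrative geometry only; the paper's framework is more general."""
    return U * np.exp(-eta * grad(U, M))

# Small synthetic PSD target with positive entries (hypothetical data).
rng = np.random.default_rng(0)
U_true = rng.uniform(0.5, 1.0, size=(5, 2))
M = U_true @ U_true.T

U = rng.uniform(0.5, 1.0, size=(5, 2))
l0 = loss(U, M)
for _ in range(300):
    U = exp_gradient_step(U, M, 0.005)
l1 = loss(U, M)  # should be smaller than l0
```

The nonlinear (multiplicative) update preserves the sign pattern of the factor automatically, one example of how a well-chosen geometry can encode structure that a plain Euclidean step does not.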

Details

Language :
English
ISSN :
1053-587X
Volume :
66
Issue :
22
Database :
Academic Search Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession number :
132684040
Full Text :
https://doi.org/10.1109/TSP.2018.2870353