Provable Algorithms for Scalable and Robust Low-Rank Matrix Recovery
- Publication Year :
- 2018
Abstract
- Low-rank models are ubiquitous in a wide range of practical applications, and low-rank matrix sensing and recovery has become a problem of great importance. By leveraging the low-rank structure in data representations, it is possible to faithfully recover the matrix of interest from incomplete observations in a statistically and computationally efficient manner. This dissertation investigates the fundamental problem of low-rank matrix recovery under different random measurement models, possibly with corruptions.

We first consider recovering low-rank positive semidefinite (PSD) matrices from random rank-one measurements, a setting that spans numerous applications including covariance sketching, phase retrieval, quantum state tomography, and learning shallow polynomial neural networks, among others. Our approach is to directly estimate the low-rank factor by minimizing a nonconvex least-squares loss function via vanilla gradient descent, following a tailored spectral initialization. When the true rank is small, this algorithm is guaranteed to converge to the ground truth (up to global ambiguity) with near-optimal sample and computational complexities with respect to the problem size. To the best of our knowledge, this is the first guarantee that achieves near-optimality in both metrics without the need for sample splitting.

When the rank-one measurements are possibly corrupted by arbitrary outliers, we propose a convex optimization algorithm that seeks the PSD matrix minimizing the l1-norm of the observation residual. The advantage of this algorithm is that it is parameter-free, eliminating the need for tuning and allowing easy implementation. We establish that, with high probability, a low-rank PSD matrix can be exactly recovered as soon as the number of measurements is sufficiently large, even when a fraction of the measurements are corrupted by outliers of arbitrary magnitude. Moreover, the recovery is also stable against bounded noise.
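The nonconvex approach described above (a tailored spectral initialization followed by vanilla gradient descent on the factored least-squares loss) can be sketched in a few lines of NumPy. This is a minimal toy illustration, not the dissertation's exact procedure: the trace-correction term in the initialization, the step-size constant, and the iteration count are assumptions chosen for a small synthetic instance with Gaussian measurement vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 20, 2, 1000                    # dimension, rank, number of measurements

# Ground truth: rank-r PSD matrix M = U* U*^T
U_star = rng.standard_normal((n, r))
M_star = U_star @ U_star.T

# Rank-one measurements y_i = a_i^T M a_i with Gaussian vectors a_i
A = rng.standard_normal((m, n))
y = np.einsum('ij,jk,ik->i', A, M_star, A)

# Spectral initialization: for Gaussian design, E[y_i a_i a_i^T] = 2M + tr(M) I,
# so subtract a trace estimate (mean(y) ~ tr(M)) and halve before factoring
Y = ((A.T * y) @ A / m - y.mean() * np.eye(n)) / 2
vals, vecs = np.linalg.eigh(Y)
top = np.argsort(vals)[::-1][:r]
U = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))

# Vanilla gradient descent on f(U) = (1/4m) sum_i (a_i^T U U^T a_i - y_i)^2,
# with step size scaled by an estimate of the top eigenvalue of M
eta = 0.1 / max(vals[top][0], 1e-8)
for _ in range(800):
    res = np.einsum('ij,jk,ik->i', A, U @ U.T, A) - y
    U -= eta * ((A.T * res) @ A @ U) / m    # gradient: (1/m) sum res_i a_i a_i^T U

rel_err = np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star)
```

Note that the recovered factor U matches U* only up to a global rotation, so the error is measured on the product U U^T rather than on the factor itself.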
Given additionally an upper bound on the rank of the PSD matrix, we then propose another nonconvex algorithm, based on subgradient descent, that exhibits excellent empirical performance in terms of computational efficiency and accuracy.

Finally, we study the recovery of generic low-rank matrices from random full-rank linear measurements in the presence of outliers, where we employ a median truncation strategy in gradient descent to improve the robustness of the recovery procedure. We demonstrate that, when initialized in a basin of attraction close to the ground truth, the proposed algorithm converges to the ground truth at a linear rate under the Gaussian measurement model with a near-optimal number of measurements, even when a constant fraction of the measurements are arbitrarily corrupted. In addition, we propose a new truncated spectral method that ensures a valid initialization within the basin of attraction, at the cost of slightly stronger requirements on the measurements.
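The median-truncation idea admits a compact sketch as well: at each iteration, measurements whose residuals are large relative to the median absolute residual are treated as likely outliers and dropped from the gradient. The NumPy toy below is an illustration under assumptions, not the dissertation's algorithm: the corruption level, truncation constant, step size, and the near-truth initialization (standing in for the truncated spectral method) are all choices made for this small synthetic instance.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2, r, m = 15, 12, 2, 800            # matrix size, rank, number of measurements

# Ground truth: generic rank-r matrix M = U* V*^T
U_star = rng.standard_normal((n1, r))
V_star = rng.standard_normal((n2, r))
M_star = U_star @ V_star.T

# Full-rank Gaussian linear measurements y_i = <A_i, M>,
# with 10% of them replaced by gross, arbitrary outliers
A = rng.standard_normal((m, n1, n2))
y = np.einsum('ijk,jk->i', A, M_star)
bad = rng.choice(m, m // 10, replace=False)
y[bad] = 50.0 * rng.standard_normal(bad.size)

# Initialize inside a basin of attraction around the truth
# (standing in for the truncated spectral initialization)
U = U_star + 0.1 * rng.standard_normal((n1, r))
V = V_star + 0.1 * rng.standard_normal((n2, r))

eta, c_trunc = 0.03, 3.0
for _ in range(600):
    res = np.einsum('ijk,jk->i', A, U @ V.T) - y
    # Median truncation: keep only measurements whose residual is
    # within c_trunc times the median absolute residual
    keep = np.abs(res) <= c_trunc * np.median(np.abs(res))
    w = np.where(keep, res, 0.0) / keep.sum()
    G = np.einsum('i,ijk->jk', w, A)     # sum_i w_i A_i over kept measurements
    U, V = U - eta * G @ V, V - eta * G.T @ U

rel_err = np.linalg.norm(U @ V.T - M_star) / np.linalg.norm(M_star)
```

Because the corrupted residuals stay far larger than the median absolute residual throughout the iterations, the truncation rule keeps excluding them, and the update behaves like gradient descent on clean data.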
Details
- Language :
- English
- Database :
- OpenDissertations
- Publication Type :
- Dissertation/Thesis
- Accession number :
- ddu.oai.etd.ohiolink.edu.osu1531755516962306