
Expectation-Maximization Gaussian-Mixture Approximate Message Passing.

Authors :
Vila, Jeremy P.
Schniter, Philip
Source :
IEEE Transactions on Signal Processing. Oct 2013, Vol. 61, Issue 19, p4658-4672. 15p.
Publication Year :
2013

Abstract

When recovering a sparse signal from noisy compressive linear measurements, the distribution of the signal's non-zero coefficients can have a profound effect on recovery mean-squared error (MSE). If this distribution were a priori known, then one could use computationally efficient approximate message passing (AMP) techniques for nearly minimum MSE (MMSE) recovery. In practice, however, the distribution is unknown, motivating the use of robust algorithms like LASSO—which is nearly minimax optimal—at the cost of significantly larger MSE for non-least-favorable distributions. As an alternative, we propose an empirical-Bayesian technique that simultaneously learns the signal distribution while MMSE-recovering the signal—according to the learned distribution—using AMP. In particular, we model the non-zero distribution as a Gaussian mixture and learn its parameters through expectation maximization, using AMP to implement the expectation step. Numerical experiments on a wide range of signal classes confirm the state-of-the-art performance of our approach, in both reconstruction error and runtime, in the high-dimensional regime, for most (but not all) sensing operators. [ABSTRACT FROM PUBLISHER]
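To make the abstract's recipe concrete, below is a minimal NumPy sketch of the idea: AMP iterations whose denoiser is the MMSE estimator under a Bernoulli-Gaussian-mixture prior, with the mixture parameters refined by EM updates computed from the AMP posterior quantities. It assumes real-valued signals, a single prior shared by all coefficients, an i.i.d. Gaussian sensing matrix, and a known noise variance; the function names, variable names (lam, omega, theta, phi, psi), initialization, and iteration counts are illustrative choices, not the authors' reference EM-GM-AMP implementation.

```python
# Minimal sketch of EM-GM-AMP-style recovery (illustrative, not the authors' code).
import numpy as np

def gm_denoiser(r, tau, lam, omega, theta, phi):
    """MMSE denoiser for the prior (1-lam)*delta(x) + lam*sum_k omega_k*N(theta_k, phi_k),
    given pseudo-measurements r = x + N(0, tau). Returns the posterior mean, posterior
    variance, and per-component responsibilities/statistics used by the EM step."""
    r = r[:, None]                                            # shape (n, 1)
    like_k = omega * np.exp(-0.5 * (r - theta)**2 / (tau + phi)) / np.sqrt(2*np.pi*(tau + phi))
    like_0 = np.exp(-0.5 * r**2 / tau) / np.sqrt(2*np.pi*tau)
    norm = (1 - lam) * like_0 + lam * like_k.sum(axis=1, keepdims=True)
    beta = lam * like_k / norm                                # responsibilities of non-zero components
    gamma = (r * phi + theta * tau) / (tau + phi)             # per-component posterior means
    nu = tau * phi / (tau + phi)                              # per-component posterior variances
    xhat = (beta * gamma).sum(axis=1)
    var = (beta * (nu + gamma**2)).sum(axis=1) - xhat**2
    return xhat, var, beta, gamma, nu

def em_gm_amp(y, A, K=3, n_iter=50, psi=1e-4):
    """Recover x from y = A x + N(0, psi I), learning the GM prior by EM (noise variance assumed known)."""
    m, n = A.shape
    lam, omega = 0.1, np.ones(K) / K                          # crude initial prior, refined by EM
    theta, phi = np.linspace(-1.0, 1.0, K), np.ones(K)
    xhat, var, z = np.zeros(n), np.ones(n), y.copy()
    for _ in range(n_iter):
        tau = psi + (n / m) * var.mean()                      # effective noise on the pseudo-measurements
        r = xhat + A.T @ z
        xhat, var, beta, gamma, nu = gm_denoiser(r, tau, lam, omega, theta, phi)
        # Onsager-corrected residual (derivative of the MMSE denoiser equals var/tau)
        z = y - A @ xhat + (n / m) * z * var.mean() / tau
        # EM updates of the mixture parameters from the posterior statistics
        bk = beta.sum(axis=0) + 1e-12                         # effective counts per component
        lam = beta.sum() / n
        omega = bk / bk.sum()
        theta = (beta * gamma).sum(axis=0) / bk
        phi = np.maximum((beta * (nu + (gamma - theta)**2)).sum(axis=0) / bk, 1e-8)
    return xhat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 500, 250, 25
    x = np.zeros(n); x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    A = rng.standard_normal((m, n)) / np.sqrt(m)              # i.i.d. Gaussian sensing matrix
    y = A @ x + 1e-2 * rng.standard_normal(m)
    xhat = em_gm_amp(y, A, psi=1e-4)
    print("NMSE (dB):", 10*np.log10(np.sum((xhat - x)**2) / np.sum(x**2)))
```

The split mirrors the abstract: the denoiser call is the (approximate) expectation step carried out by AMP, and the closing lines of each iteration are the maximization step that re-fits the sparsity level and Gaussian-mixture parameters to the data.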

Details

Language :
English
ISSN :
1053-587X
Volume :
61
Issue :
19
Database :
Academic Search Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession number :
90052891
Full Text :
https://doi.org/10.1109/TSP.2013.2272287