On the Support Recovery of Jointly Sparse Gaussian Sources via Sparse Bayesian Learning
- Author
- Khanna, Saurabh and Murthy, Chandra R.
- Subjects
Error probability, Constrained optimization, Columns, Matrix multiplications, Sparse matrices
- Abstract
In this work, we provide non-asymptotic, probabilistic guarantees for successful recovery of the common nonzero support of jointly sparse Gaussian sources in the multiple measurement vector (MMV) problem. The support recovery problem is formulated as the marginalized maximum likelihood (or type-II ML) estimation of the variance hyperparameters of a joint-sparsity-inducing Gaussian prior on the source signals. We derive conditions under which the resulting nonconvex constrained optimization perfectly recovers the nonzero support of a joint-sparse Gaussian source ensemble with arbitrarily high probability. The support error probability decays exponentially with the number of MMVs, at a rate that depends on the smallest restricted singular value and the nonnegative null space property of the self Khatri-Rao product of the sensing matrix. Our analysis confirms that nonzero supports of size as high as $O(m^{2})$ are recoverable from $m$ measurements per sparse vector. Our derived sufficient conditions for support consistency of the proposed constrained type-II ML solution also guarantee the support consistency of any global solution of the multiple sparse Bayesian learning (M-SBL) optimization whose nonzero coefficients lie inside a bounded interval. For the case of noiseless measurements, we further show that a single MMV is sufficient for perfect recovery of the $k$-sparse support by M-SBL, provided all subsets of $k + 1$ columns of the sensing matrix are linearly independent. [ABSTRACT FROM AUTHOR]
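The $O(m^{2})$ identifiability claim in the abstract rests on the self Khatri-Rao product of the sensing matrix: the column-wise Kronecker product of an $m \times n$ matrix $A$ with itself has $m^{2}$ rows, so its columns can remain linearly independent for $n$ well beyond $m$. A minimal numerical sketch of this dimension argument (the random matrix `A` and the sizes below are illustrative stand-ins, not taken from the paper):

```python
import numpy as np

def self_khatri_rao(A):
    """Column-wise Kronecker product of A with itself: column i is a_i (x) a_i."""
    m, n = A.shape
    # element [i, j, c] = A[i, c] * A[j, c]; flatten the first two axes
    return np.einsum("in,jn->ijn", A, A).reshape(m * m, n)

rng = np.random.default_rng(0)
m, n = 4, 10                      # more columns than rows: A is column-rank deficient
A = rng.standard_normal((m, n))

K = self_khatri_rao(A)
print(K.shape)                    # (16, 10): m^2 rows versus A's m rows
print(np.linalg.matrix_rank(A))   # at most 4
print(np.linalg.matrix_rank(K))   # generically 10: full column rank
```

The final rank illustrates why variance hyperparameters supported on far more than $m$ columns can still be distinguishable at the covariance (type-II ML) level, which is where the paper's analysis operates.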
- Published
- 2022