
Compressive Sensing via Variational Bayesian Inference under Two Widely Used Priors: Modeling, Comparison and Discussion.

Authors :
Shekaramiz, Mohammad
Moon, Todd K.
Source :
Entropy, Mar. 2023, Vol. 25, Issue 3, p. 511, 32 pp.
Publication Year :
2023

Abstract

Compressive sensing is a sub-Nyquist sampling technique for efficient acquisition and reconstruction of sparse or compressible signals. To account for the sparsity of the underlying signal of interest, it is common to place sparsifying priors, such as the Bernoulli–Gaussian-inverse Gamma (BGiG) and Gaussian-inverse Gamma (GiG) priors, on the components of the signal. With the introduction of variational Bayesian inference, sparse Bayesian learning (SBL) methods for solving the inverse problem of compressive sensing have received significant interest, since variational inference makes SBL considerably more efficient in terms of execution time. In this paper, we consider the sparse signal recovery problem using compressive sensing and the variational Bayesian (VB) inference framework. More specifically, we consider the two widely used Bayesian models, BGiG and GiG, for modeling the underlying sparse signal. Although these two models have been widely applied to sparse recovery problems under various signal structures, the question of which model outperforms the other for sparse signal recovery under no specific structure has yet to be fully addressed in the VB inference setting. Here, we study these two models in detail under VB inference, provide motivating examples of the signal-reconstruction issues that may arise under each model, perform comparisons, and provide suggestions on how to improve the performance of each model.
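For orientation, the GiG model referenced in the abstract places a zero-mean Gaussian prior on each signal component with its own precision and a Gamma hyperprior on that precision; variational inference then alternates closed-form updates between a Gaussian posterior over the signal and Gamma posteriors over the precisions. The following is a minimal sketch (not the authors' code, and not the BGiG variant) of such a VB recovery loop for a synthetic compressive-sensing problem; the hyperparameters a0 and b0, the assumed-known noise precision beta, and the iteration count are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic compressive-sensing setup: N-dimensional, K-sparse signal
# observed through M < N random Gaussian measurements.
N, M, K = 100, 40, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x_true + 0.01 * rng.standard_normal(M)

a0, b0 = 1e-6, 1e-6       # broad Gamma hyperpriors on the component precisions (assumed values)
beta = 1.0 / 0.01**2      # noise precision, assumed known for this sketch
alpha = np.ones(N)        # current E[alpha_i] under q(alpha_i)

for _ in range(100):
    # q(x) is Gaussian: Sigma = (beta * A^T A + diag(alpha))^{-1}, mu = beta * Sigma A^T y
    Sigma = np.linalg.inv(beta * A.T @ A + np.diag(alpha))
    mu = beta * Sigma @ A.T @ y
    # q(alpha_i) is Gamma(a0 + 1/2, b0 + E[x_i^2]/2); use its mean as the point update
    alpha = (a0 + 0.5) / (b0 + 0.5 * (mu**2 + np.diag(Sigma)))

print("relative reconstruction error:",
      np.linalg.norm(mu - x_true) / np.linalg.norm(x_true))

In this sketch the sparsity emerges because precisions alpha_i for irrelevant components grow large, shrinking the corresponding posterior means toward zero; the BGiG model instead introduces explicit Bernoulli support variables, which the paper compares against this formulation.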

Details

Language :
English
ISSN :
1099-4300
Volume :
25
Issue :
3
Database :
Academic Search Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession number :
162812669
Full Text :
https://doi.org/10.3390/e25030511