
Sharp MSE Bounds for Proximal Denoising

Authors:
Oymak, Samet
Hassibi, Babak
Source:
Foundations of Computational Mathematics. Aug 2016, Vol. 16, Issue 4, p965-1029. 65p.
Publication Year:
2016

Abstract

Denoising has to do with estimating a signal $\mathbf{x}_0$ from its noisy observations $\mathbf{y}=\mathbf{x}_0+\mathbf{z}$. In this paper, we focus on the 'structured denoising problem,' where the signal $\mathbf{x}_0$ possesses a certain structure and $\mathbf{z}$ has independent normally distributed entries with mean zero and variance $\sigma^2$. We employ a structure-inducing convex function $f(\cdot)$ and solve $\min_{\mathbf{x}}\{\frac{1}{2}\Vert\mathbf{y}-\mathbf{x}\Vert_2^2+\sigma\lambda f(\mathbf{x})\}$ to estimate $\mathbf{x}_0$, for some $\lambda>0$. Common choices for $f(\cdot)$ include the $\ell_1$ norm for sparse vectors, the $\ell_1$-$\ell_2$ norm for block-sparse signals, and the nuclear norm for low-rank matrices. The metric we use to evaluate the performance of an estimate $\mathbf{x}^*$ is the normalized mean-squared error $\text{NMSE}(\sigma)=\frac{\mathbb{E}\Vert\mathbf{x}^*-\mathbf{x}_0\Vert_2^2}{\sigma^2}$. We show that the NMSE is maximized as $\sigma\rightarrow 0$, and we find the exact worst-case NMSE, which has a simple geometric interpretation: the mean-squared distance of a standard normal vector to the $\lambda$-scaled subdifferential $\lambda\partial f(\mathbf{x}_0)$. When $\lambda$ is optimally tuned to minimize the worst-case NMSE, our results can be related to the constrained denoising problem $\min_{f(\mathbf{x})\le f(\mathbf{x}_0)}\{\Vert\mathbf{y}-\mathbf{x}\Vert_2\}$. The paper also connects these results to the generalized LASSO problem, in which one solves $\min_{f(\mathbf{x})\le f(\mathbf{x}_0)}\{\Vert\mathbf{y}-\mathbf{A}\mathbf{x}\Vert_2\}$ to estimate $\mathbf{x}_0$ from noisy linear observations $\mathbf{y}=\mathbf{A}\mathbf{x}_0+\mathbf{z}$. We show that certain properties of the LASSO problem are closely related to the denoising problem. In particular, we characterize the normalized LASSO cost and show that it exhibits a 'phase transition' as a function of the number of observations. We also provide an order-optimal bound for the LASSO error in terms of the mean-squared distance. Our results are significant in two ways. First, we find a simple formula for the performance of a general convex estimator. Second, we establish a connection between the denoising and linear inverse problems. [ABSTRACT FROM AUTHOR]
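To make the setup concrete, here is a minimal numerical sketch (not from the paper) for the special case $f(\cdot)=\Vert\cdot\Vert_1$: the proximal estimator reduces to elementwise soft-thresholding, and for small $\sigma$ its empirical NMSE can be compared against the mean-squared distance of a standard normal vector to $\lambda\partial\Vert\mathbf{x}_0\Vert_1$. The helper name soft_threshold and all parameter values (n, k, lam, sigma, trials) are illustrative assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(y, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)

# Illustrative setup: a k-sparse signal in R^n with entries well above the noise.
n, k = 1000, 50
support = rng.choice(n, size=k, replace=False)
x0 = np.zeros(n)
x0[support] = rng.choice([-10.0, 10.0], size=k)

lam = 2.0      # regularization weight lambda (illustrative choice)
sigma = 1e-3   # small noise level; the abstract says NMSE is maximized as sigma -> 0
trials = 200

# Empirical NMSE of the proximal estimator x* = prox_{sigma*lam*||.||_1}(x0 + z),
# i.e., the minimizer of (1/2)||y - x||_2^2 + sigma*lam*||x||_1 with y = x0 + z.
sq_err = 0.0
for _ in range(trials):
    z = sigma * rng.standard_normal(n)
    x_star = soft_threshold(x0 + z, sigma * lam)
    sq_err += np.sum((x_star - x0) ** 2)
nmse = sq_err / (trials * sigma ** 2)

# Mean-squared distance of a standard normal g to lam * subdifferential of ||.||_1
# at x0: on the support the subdifferential is the single point lam*sign(x0_i);
# off the support it is the interval [-lam, lam].
off_support = np.ones(n, dtype=bool)
off_support[support] = False
dist2 = 0.0
for _ in range(trials):
    g = rng.standard_normal(n)
    on = np.sum((g[support] - lam * np.sign(x0[support])) ** 2)
    off = np.sum(np.maximum(np.abs(g[off_support]) - lam, 0.0) ** 2)
    dist2 += on + off
dist2 /= trials

print(f"empirical NMSE at sigma={sigma}: {nmse:.1f}")
print(f"E dist(g, lam*subdiff(f)(x0))^2: {dist2:.1f}")
```

For small $\sigma$ the two printed quantities should roughly agree, which is this special case of the abstract's claim that the worst-case NMSE equals the mean-squared distance of a standard normal vector to $\lambda\partial f(\mathbf{x}_0)$.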

Details

Language:
English
ISSN:
1615-3375
Volume:
16
Issue:
4
Database:
Academic Search Index
Journal:
Foundations of Computational Mathematics
Publication Type:
Academic Journal
Accession Number:
117017946
Full Text:
https://doi.org/10.1007/s10208-015-9278-4