Bilinear Factor Matrix Norm Minimization for Robust PCA: Algorithms and Applications.
- Source :
- IEEE Transactions on Pattern Analysis & Machine Intelligence; Sep2018, Vol. 40 Issue 9, p2066-2080, 15p
- Publication Year :
- 2018
Abstract
- The heavy-tailed distributions of corrupted outliers and of the singular values of all channels in low-level vision have proven to be effective priors for many applications, such as background modeling, photometric stereo and image alignment, and they can be well modeled by a hyper-Laplacian. However, the use of such distributions generally leads to challenging non-convex, non-smooth and non-Lipschitz problems, and makes existing algorithms very slow for large-scale applications. Building on the analytic solutions to $\ell_{p}$-norm minimization for two specific values of $p$, namely $p=1/2$ and $p=2/3$, we propose two novel bilinear factor matrix norm minimization models for robust principal component analysis. We first define the double nuclear norm and Frobenius/nuclear hybrid norm penalties, and then prove that they are in essence the Schatten-$1/2$ and $2/3$ quasi-norms, respectively, which lead to much more tractable and scalable Lipschitz optimization problems. Our experimental analysis shows that both of our methods yield more accurate solutions than the original Schatten quasi-norm minimization, even when the number of observations is very limited. Finally, we apply our penalties to various low-level vision problems, e.g., text removal, moving object detection, image alignment and inpainting, and show that our methods usually outperform the state-of-the-art methods. [ABSTRACT FROM AUTHOR]
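The abstract's "analytic solutions to $\ell_{p}$-norm minimization" refers to the fact that the scalar proximal problem has a closed form for $p=1/2$ and $p=2/3$. As a minimal illustration (not the paper's actual algorithm), the following sketch implements the well-known half-thresholding rule for the $p=1/2$ case, i.e., the closed-form minimizer of $\tfrac{1}{2}(x-y)^2 + \lambda|x|^{1/2}$; the function name and the scaling convention for $\lambda$ are our own assumptions.

```python
import numpy as np

def half_threshold(y, lam):
    """Closed-form minimizer of 0.5*(x - y)**2 + lam*|x|**0.5
    (half-thresholding for the l_{1/2} case; illustrative sketch,
    not the paper's full bilinear factorization algorithm).
    """
    y = np.asarray(y, dtype=float)
    # Below this threshold the minimizer is exactly 0 (hard shrinkage region).
    thresh = 1.5 * lam ** (2.0 / 3.0)
    # Safe magnitude so the arccos argument stays in [-1, 1] even where
    # the result will be masked to zero anyway.
    mag = np.maximum(np.abs(y), thresh)
    phi = np.arccos(np.clip((lam / 4.0) * (mag / 3.0) ** -1.5, -1.0, 1.0))
    # Trigonometric root of the stationarity cubic, mapped back to x = t**2.
    x = (2.0 / 3.0) * np.abs(y) * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return np.where(np.abs(y) > thresh, np.sign(y) * x, 0.0)
```

Unlike soft-thresholding for the $\ell_1$ norm, this operator is discontinuous at its threshold, which reflects the non-convexity of the $\ell_{1/2}$ quasi-norm; having such a closed form is what makes the resulting iterations cheap despite the non-convex, non-smooth objective.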
Details
- Language :
- English
- ISSN :
- 0162-8828
- Volume :
- 40
- Issue :
- 9
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Pattern Analysis & Machine Intelligence
- Publication Type :
- Academic Journal
- Accession number :
- 131092634
- Full Text :
- https://doi.org/10.1109/TPAMI.2017.2748590