Accurate Low-Rank Approximations via a Few Iterations of Alternating Least Squares
Source: SIAM Journal on Matrix Analysis & Applications. 2017, Vol. 38, Issue 2, p425-433. 9p.
Publication Year: 2017
Abstract
A few iterations of alternating least squares with a random starting point provably suffice to produce nearly optimal spectral- and Frobenius-norm accuracies of low-rank approximations to a matrix; iterating to convergence of the matrix entries is unnecessary. Such good accuracy is in fact well known for the low-rank approximations calculated via subspace iterations and other well-known methods that happen to produce mathematically the same low-rank approximations as alternating least squares, at least when starting all the methods with the same appropriately random initializations. Thus, software implementing alternating least squares can be retrofitted via appropriate setting of parameters to calculate nearly optimally accurate low-rank approximations highly efficiently, with no need for convergence of the matrix entries. (Even so, convergence could still be helpful for some applications, say to ensure that the approximations are strongly rank-revealing.) [ABSTRACT FROM AUTHOR]
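The abstract's claim can be illustrated with a minimal sketch (an assumption of this record, not the paper's own code): a rank-k alternating-least-squares loop started from a random factor, run for only a few iterations, and compared against the optimal rank-k truncated SVD. The test matrix, iteration count, and error comparison below are illustrative choices.

```python
import numpy as np

def als_low_rank(A, k, iters=3, seed=0):
    """A few alternating-least-squares iterations from a random start.

    Returns factors B (m x k) and C (k x n) with A ~ B @ C.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    C = rng.standard_normal((k, n))  # random starting point
    for _ in range(iters):
        # Fix C; solve min_B ||A - B C||_F (transposed least-squares problem).
        B = np.linalg.lstsq(C.T, A.T, rcond=None)[0].T
        # Fix B; solve min_C ||A - B C||_F.
        C = np.linalg.lstsq(B, A, rcond=None)[0]
    return B, C

# Illustrative test matrix with geometrically decaying singular values.
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((60, 20)))
V, _ = np.linalg.qr(rng.standard_normal((50, 20)))
s = 2.0 ** -np.arange(1, 21)
A = (U * s) @ V.T

k = 5
B, C = als_low_rank(A, k, iters=3)
err_als = np.linalg.norm(A - B @ C)

# Optimal rank-k Frobenius-norm error, from the truncated SVD.
Uf, sf, Vtf = np.linalg.svd(A, full_matrices=False)
err_opt = np.linalg.norm(A - (Uf[:, :k] * sf[:k]) @ Vtf[:k])
```

After just three iterations, `err_als` should already be within a small factor of `err_opt`; each full ALS sweep acts like a step of subspace iteration, so the accuracy improves geometrically in the singular-value gap rather than requiring convergence of the matrix entries.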
Details
- Language: English
- ISSN: 0895-4798
- Volume: 38
- Issue: 2
- Database: Academic Search Index
- Journal: SIAM Journal on Matrix Analysis & Applications
- Publication Type: Academic Journal
- Accession Number: 123852875
- Full Text: https://doi.org/10.1137/16M1064556