
ALTERNATING LEAST SQUARES AS MOVING SUBSPACE CORRECTION.

Authors :
OSELEDETS, IVAN V.
RAKHUBA, MAXIM V.
USCHMAJEW, ANDRÉ
Source :
SIAM Journal on Numerical Analysis. 2018, Vol. 56 Issue 6, p3459-3479. 21p.
Publication Year :
2018

Abstract

In this note we take a new look at the local convergence of alternating optimization methods for low-rank matrices and tensors. Our abstract interpretation as sequential optimization on moving subspaces yields insightful reformulations of some known convergence conditions that focus on the interplay between the contractivity of classical multiplicative Schwarz methods with overlapping subspaces and the curvature of low-rank matrix and tensor manifolds. While the verification of the abstract conditions in concrete scenarios remains open in most cases, we are able to provide an alternative and conceptually simple derivation of the asymptotic convergence rate of the two-sided block power method of numerical algebra for computing the dominant singular subspaces of a rectangular matrix. This method is equivalent to an alternating least squares method applied to a distance function. The theoretical results are illustrated and validated by numerical experiments. [ABSTRACT FROM AUTHOR]
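For orientation, the two-sided block power method referred to in the abstract can be sketched as a classical orthogonal iteration that alternately updates the left and right k-dimensional subspaces of a rectangular matrix. The sketch below is illustrative only; the function name, parameters, and stopping rule are assumptions and not the authors' formulation, which is derived as an alternating least squares method applied to a distance function.

import numpy as np

def two_sided_block_power(A, k, n_iter=100, seed=0):
    # Minimal sketch of a two-sided block power (orthogonal) iteration:
    # alternately orthonormalize A @ V and A.T @ U to approximate the
    # dominant left/right k-dimensional singular subspaces of A.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    V = np.linalg.qr(rng.standard_normal((n, k)))[0]  # random right subspace
    for _ in range(n_iter):
        U = np.linalg.qr(A @ V)[0]    # left update: orthonormal basis of A V
        V = np.linalg.qr(A.T @ U)[0]  # right update: orthonormal basis of A^T U
    return U, V

As a quick check, the subspaces spanned by the returned U and V can be compared with the leading left and right singular vectors obtained from np.linalg.svd(A); convergence to the dominant singular subspaces is expected when the k-th and (k+1)-th singular values are well separated.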

Details

Language :
English
ISSN :
0036-1429
Volume :
56
Issue :
6
Database :
Academic Search Index
Journal :
SIAM Journal on Numerical Analysis
Publication Type :
Academic Journal
Accession number :
133714902
Full Text :
https://doi.org/10.1137/17M1148712