
Continuous low-rank tensor decompositions, with applications to stochastic optimal control and data assimilation

Authors :
Gorodetsky, Alex Arkady
Sertac Karaman and Youssef M. Marzouk
Massachusetts Institute of Technology. Department of Aeronautics and Astronautics.
Publication Year :
2017

Abstract

Thesis: Ph.D., Massachusetts Institute of Technology, Department of Aeronautics and Astronautics, 2017. Cataloged from PDF version of thesis. Includes bibliographical references (pages 205-214).

Optimal decision making under uncertainty is critical for control and optimization of complex systems. However, many techniques for solving problems such as stochastic optimal control and data assimilation encounter the curse of dimensionality when too many state variables are involved. In this thesis, we propose a framework for computing with high-dimensional functions that mitigates this exponential growth in complexity for problems with separable structure.

Our framework tightly integrates two emerging areas: tensor decompositions and continuous computation. Tensor decompositions are able to effectively compress and operate with low-rank multidimensional arrays. Continuous computation is a paradigm for computing with functions instead of arrays; it is best realized by Chebfun, a MATLAB package for computing with functions of up to three dimensions. Continuous computation provides a natural framework for building numerical algorithms that effectively, naturally, and automatically adapt to problem structure.

The first part of this thesis describes a compressed continuous computation framework centered around a continuous analogue to the (discrete) tensor-train decomposition called the function-train decomposition. Computation with the function-train requires continuous matrix factorizations and continuous numerical linear algebra. Continuous analogues are presented for performing cross approximation; rounding; multilinear algebra operations such as addition, multiplication, integration, and differentiation; and continuous, rank-revealing, alternating least squares. Advantages of the function-train over the tensor-train include the ability to adaptively approximate functions and the ability to compute with functions that are parameterized differently. For example, while elementwise multiplication between tensors of different sizes is undefined, functions in FT format can be readily multiplied together. Next, we develop compressed versions of value iteration and policy iteration.

by Alex Arkady Gorodetsky. Ph.D.
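The multiplication claim in the abstract can be illustrated with a minimal sketch (not the thesis's implementation, and using plain Python callables rather than the thesis's polynomial parameterizations): a rank-r separable function f(x, y) = Σ_k a_k(x) b_k(y) is stored as a list of univariate-factor pairs, so two such functions can be multiplied as functions regardless of how each is discretized, with the product's rank bounded by the product of the ranks. The names `ft_eval` and `ft_multiply` are illustrative, not from the source.

```python
import math

def ft_eval(factors, x, y):
    """Evaluate sum_k a_k(x) * b_k(y) from a list of (a_k, b_k) pairs."""
    return sum(a(x) * b(y) for a, b in factors)

def ft_multiply(f_factors, g_factors):
    """Pointwise product of two separable representations.

    (sum_i a_i b_i) * (sum_j c_j d_j) = sum_{i,j} (a_i*c_j)(b_i*d_j),
    so the product has rank at most rank(f) * rank(g).
    """
    return [
        # default arguments bind the current a, b, c, d into each closure
        (lambda x, a=a, c=c: a(x) * c(x),
         lambda y, b=b, d=d: b(y) * d(y))
        for a, b in f_factors
        for c, d in g_factors
    ]

# f(x, y) = x*y + sin(x)*cos(y): rank 2
f = [(lambda x: x, lambda y: y), (math.sin, math.cos)]
# g(x, y) = exp(x) * 1: rank 1, parameterized completely differently
g = [(math.exp, lambda y: 1.0)]

h = ft_multiply(f, g)  # rank at most 2 * 1 = 2
x, y = 0.3, 1.2
assert abs(ft_eval(h, x, y) - ft_eval(f, x, y) * ft_eval(g, x, y)) < 1e-12
```

No grids or index sets ever need to agree, which is the point being made: the product is formed on the factor functions themselves, whereas elementwise multiplication of two discrete tensors requires identical mode sizes.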

Details

Database :
OAIster
Notes :
214 pages, application/pdf, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1139748034
Document Type :
Electronic Resource