Stable and efficient differentiation of tensor network algorithms
- Publication Year :
- 2023
Abstract
- Gradient-based optimization methods are the established state-of-the-art paradigm for studying strongly entangled quantum systems in two dimensions with Projected Entangled Pair States. However, the key ingredient, the gradient itself, has proven challenging to calculate accurately and reliably in the case of a corner transfer matrix (CTM)-based approach. Automatic differentiation (AD), which is the best-known tool for calculating the gradient, still suffers from some crucial shortcomings. Some of these are known, such as the problem of excessive memory usage and the divergences which may arise when differentiating a singular value decomposition (SVD). Importantly, we also find that there is a fundamental inaccuracy in the currently used backpropagation of the SVD that had not been noted before. In this paper, we describe all these problems and provide compact, easy-to-implement solutions for each of them. We analyse the impact of these changes and find that the last problem -- the use of the correct gradient -- is by far the dominant one and should thus be considered a crucial patch to any AD application that makes use of an SVD for truncation.
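- As a concrete illustration of the SVD divergence mentioned in the abstract, the sketch below shows the standard backward pass (vector-Jacobian product) of a full SVD of a real square matrix, with Lorentzian broadening of the 1/(s_j^2 - s_i^2) factors. This broadening is the common remedy for (near-)degenerate singular values; it is not the paper's corrected gradient for truncated SVD, and the function name svd_vjp and the eps parameter are illustrative choices, not taken from the paper.

```python
import numpy as np

def svd_vjp(U, S, Vt, dU, dS, dVt, eps=1e-12):
    """Backward pass of a full SVD A = U @ diag(S) @ Vt for a real square matrix.
    The 1/(s_j^2 - s_i^2) factors are Lorentzian-broadened by eps so that
    (near-)degenerate singular values do not produce divergences."""
    V, dV = Vt.T, dVt.T
    s2 = S ** 2
    diff = s2[None, :] - s2[:, None]          # s_j^2 - s_i^2
    F = diff / (diff ** 2 + eps)              # broadened 1 / (s_j^2 - s_i^2)
    np.fill_diagonal(F, 0.0)                  # no self-coupling on the diagonal

    J = F * (U.T @ dU - dU.T @ U)             # antisymmetric part from the U cotangent
    K = F * (V.T @ dV - dV.T @ V)             # antisymmetric part from the V cotangent
    inner = J * S[None, :] + S[:, None] * K + np.diag(dS)
    return U @ inner @ Vt                     # cotangent with respect to A

# Smoke test on a random 4x4 matrix with dummy cotangents (illustrative only).
A = np.random.default_rng(0).normal(size=(4, 4))
U, S, Vt = np.linalg.svd(A)
dA = svd_vjp(U, S, Vt, np.ones_like(U), np.ones_like(S), np.ones_like(Vt))
```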
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2311.11894
- Document Type :
- Working Paper