
Manifold learning in Wasserstein space

Authors:
Hamm, Keaton
Moosmüller, Caroline
Schmitzer, Bernhard
Thorpe, Matthew
Publication Year:
2023

Abstract

This paper aims at building the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures on a compact and convex subset of $\mathbb{R}^d$, metrized with the Wasserstein-2 distance $\mathrm{W}$. We begin by introducing a construction of submanifolds $\Lambda$ of probability measures equipped with metric $\mathrm{W}_\Lambda$, the geodesic restriction of $\mathrm{W}$ to $\Lambda$. In contrast to other constructions, these submanifolds are not necessarily flat, but still allow for local linearizations in a similar fashion to Riemannian submanifolds of $\mathbb{R}^d$. We then show how the latent manifold structure of $(\Lambda,\mathrm{W}_{\Lambda})$ can be learned from samples $\{\lambda_i\}_{i=1}^N$ of $\Lambda$ and pairwise extrinsic Wasserstein distances $\mathrm{W}$ only. In particular, we show that the metric space $(\Lambda,\mathrm{W}_{\Lambda})$ can be asymptotically recovered in the sense of Gromov--Wasserstein from a graph with nodes $\{\lambda_i\}_{i=1}^N$ and edge weights $\mathrm{W}(\lambda_i,\lambda_j)$. In addition, we demonstrate how the tangent space at a sample $\lambda$ can be asymptotically recovered via spectral analysis of a suitable "covariance operator" using optimal transport maps from $\lambda$ to sufficiently close and diverse samples $\{\lambda_i\}_{i=1}^N$. The paper closes with some explicit constructions of submanifolds $\Lambda$ and numerical examples on the recovery of tangent spaces through spectral analysis.
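The two recovery steps described in the abstract translate naturally into a discrete pipeline: estimate pairwise Wasserstein-2 distances to form the graph, then eigendecompose a covariance built from optimal transport maps at a base sample. Below is a minimal numerical sketch of both steps, not the authors' code: it assumes each measure $\lambda_i$ is approximated by a uniform point cloud and uses the POT library (`ot.dist`, `ot.emd`, `ot.emd2`); the barycentric projection standing in for the OT map and the uncentered Gram-matrix "covariance" are simplifying assumptions.

```python
# Minimal sketch (not the authors' implementation), assuming each measure is
# approximated by a uniform point cloud and that the POT library is installed.
import numpy as np
import ot  # Python Optimal Transport, https://pythonot.github.io

def w2(X, Y):
    """Wasserstein-2 distance between uniform empirical measures on point clouds X, Y."""
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    M = ot.dist(X, Y)                 # pairwise squared Euclidean costs (POT default)
    return np.sqrt(ot.emd2(a, b, M))  # optimal transport cost, then square root

def transport_map(X, Y):
    """Approximate OT map X -> Y via barycentric projection of the optimal coupling."""
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    P = ot.emd(a, b, ot.dist(X, Y))   # optimal coupling matrix
    return (P @ Y) / a[:, None]       # T(x_k) = mean of Y under the k-th coupling row

def distance_graph(samples):
    """Step 1: graph with nodes lambda_i and edge weights W(lambda_i, lambda_j)."""
    N = len(samples)
    D = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1, N):
            D[i, j] = D[j, i] = w2(samples[i], samples[j])
    return D  # shortest paths in this graph approximate the intrinsic metric W_Lambda

def tangent_spectrum(base, neighbors):
    """Step 2: spectrum of a discrete stand-in for the paper's covariance operator,
    built from transport maps from `base` to close, diverse neighbor samples."""
    V = np.stack([(transport_map(base, Y) - base).ravel() for Y in neighbors])
    C = (V @ V.T) / len(neighbors)    # Gram matrix of the (uncentered) map displacements
    evals = np.linalg.eigvalsh(C)     # ascending eigenvalues of the symmetric matrix
    return evals[::-1]                # a spectral gap suggests dim of the tangent space
```

In this sketch, a gap in the eigenvalues returned by `tangent_spectrum` would indicate the intrinsic dimension of $\Lambda$, mirroring the paper's spectral analysis of the covariance operator, while shortest-path distances through the matrix from `distance_graph` play the role of the Gromov--Wasserstein recovery of $(\Lambda,\mathrm{W}_\Lambda)$.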

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2311.08549
Document Type:
Working Paper