
Normalizing Flows Across Dimensions

Authors:
Cunningham, Edmond
Zabounidis, Renos
Agrawal, Abhinav
Fiterau, Madalina
Sheldon, Daniel
Publication Year:
2020

Abstract

Real-world data with underlying structure, such as pictures of faces, are hypothesized to lie on a low-dimensional manifold. This manifold hypothesis has motivated state-of-the-art generative algorithms that learn low-dimensional data representations. Unfortunately, a popular generative model, normalizing flows, cannot take advantage of this. Normalizing flows are based on successive variable transformations that are, by design, incapable of learning lower-dimensional representations. In this paper we introduce noisy injective flows (NIF), a generalization of normalizing flows that can go across dimensions. NIF explicitly map the latent space to a learnable manifold in a high-dimensional data space using injective transformations. We further employ an additive noise model to account for deviations from the manifold and identify a stochastic inverse of the generative process. Empirically, we demonstrate that a simple application of our method to existing flow architectures can significantly improve sample quality and yield separable data embeddings.
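To make the generative structure concrete, the sketch below is a minimal linear-Gaussian instance of the idea described in the abstract (essentially probabilistic PCA): a fixed tall matrix stands in for the learnable injective transformation from the latent space to a manifold in data space, additive Gaussian noise accounts for deviations from that manifold, and the resulting Gaussian posterior over latents plays the role of a stochastic inverse. The function names, the linear map, and the noise level are illustrative assumptions, not the authors' NIF architecture.

```python
# Minimal sketch of an "injective map + additive noise" generative model.
# Assumptions: linear injective map, fixed noise level; NIF instead learns a
# nonlinear injective flow across dimensions.
import numpy as np

rng = np.random.default_rng(0)

latent_dim, data_dim = 2, 5                      # low-dimensional latent, higher-dimensional data
A = rng.normal(size=(data_dim, latent_dim))      # tall matrix => injective linear map
b = rng.normal(size=data_dim)                    # offset of the (linear) manifold
noise_std = 0.1                                  # additive noise models deviation from the manifold

def generate(n):
    """Sample data: push a latent Gaussian through the injective map, then add noise."""
    z = rng.normal(size=(n, latent_dim))         # latent samples z ~ N(0, I)
    x_on_manifold = z @ A.T + b                  # points on the low-dimensional manifold in data space
    return x_on_manifold + noise_std * rng.normal(size=(n, data_dim))

def stochastic_inverse(x):
    """Gaussian posterior over z given x in this linear-Gaussian model:
    a tractable example of a stochastic inverse of the generative process."""
    precision = np.eye(latent_dim) + (A.T @ A) / noise_std**2
    cov = np.linalg.inv(precision)               # posterior covariance (shared across points)
    mean = ((x - b) @ A / noise_std**2) @ cov    # posterior means, one row per data point
    return mean, cov

x = generate(4)                                  # 4 noisy samples around the manifold
z_mean, z_cov = stochastic_inverse(x)
print(x.shape, z_mean.shape)                     # (4, 5) (4, 2)
```

In the paper's method the injective map is a learned, nonlinear flow across dimensions rather than a fixed matrix, but the generate-then-perturb structure and the need for a stochastic inverse of that process are the same.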

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2006.13070
Document Type:
Working Paper