A generative flow for conditional sampling via optimal transport
- Publication Year :
- 2023
Abstract
- Sampling conditional distributions is a fundamental task for Bayesian inference and density estimation. Generative models, such as normalizing flows and generative adversarial networks, characterize conditional distributions by learning a transport map that pushes forward a simple reference (e.g., a standard Gaussian) to a target distribution. While these approaches successfully describe many non-Gaussian problems, their performance is often limited by parametric bias and by the reliability of the gradient-based (adversarial) optimizers used to learn these transformations. This work proposes a non-parametric generative model that iteratively maps reference samples to the target. The model uses block-triangular transport maps, whose components are shown to characterize conditionals of the target distribution. These maps arise from solving an optimal transport problem with a weighted $L^2$ cost function, thereby extending the data-driven approach in [Trigila and Tabak, 2016] to conditional sampling. The proposed approach is demonstrated on a two-dimensional example and on a parameter inference problem involving nonlinear ODEs.
- 18 pages, 5 figures
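The core idea in the abstract — a block-triangular map whose first block is the identity in the conditioning variable, so that its second component pushes a Gaussian reference to the conditional of the target — can be illustrated with a minimal sketch. This is not the paper's non-parametric OT algorithm; it is a hypothetical closed-form example for a jointly Gaussian target, where the conditional map is known analytically (the coefficients `a`, `sigma` and the function `conditional_map` are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Jointly Gaussian target: X ~ N(0, 1), Y = a*X + noise, noise ~ N(0, sigma^2).
# (a and sigma are illustrative constants, not from the paper.)
a, sigma = 0.7, 0.5

# Block-triangular map T(x, z) = (x, t(x, z)): the first block is the identity
# in x, so the second component pushes the reference z ~ N(0, 1) forward to the
# conditional Y | X = x.  For this Gaussian toy target the map is closed-form:
#   t(x, z) = a*x + sigma*z
def conditional_map(x, z):
    return a * x + sigma * z

# Sample the conditional Y | X = x_star by pushing reference samples forward.
x_star = 1.0
z = rng.standard_normal(100_000)
y_cond = conditional_map(x_star, z)

# The pushforward samples should have mean ~ a*x_star and std ~ sigma.
print(y_cond.mean(), y_cond.std())
```

In the paper's setting the second component of the map is learned from samples by solving a weighted-$L^2$ optimal transport problem rather than written down analytically; the sketch only shows why a block-triangular structure characterizes conditionals.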
- Subjects :
- FOS: Computer and information sciences
FOS: Mathematics
Machine Learning (cs.LG)
Machine Learning (stat.ML)
Optimization and Control (math.OC)
Computation (stat.CO)
Details
- Language :
- English
- Database :
- OpenAIRE
- Accession number :
- edsair.doi.dedup.....54e3d69d97cc0fa5307f08bc89e51001