MaCow: Masked Convolutional Generative Flow
- Source : NeurIPS 2019
- Publication Year : 2019
Abstract
- Flow-based generative models, conceptually attractive due to the tractability of both exact log-likelihood computation and latent-variable inference, as well as the efficiency of both training and sampling, have led to a number of impressive empirical successes and spawned many advanced variants and theoretical investigations. Despite their computational efficiency, the density estimation performance of flow-based generative models falls significantly behind that of state-of-the-art autoregressive models. In this work, we introduce masked convolutional generative flow (MaCow), a simple yet effective architecture of generative flow using masked convolution. By restricting the local connectivity in a small kernel, MaCow enjoys the properties of fast and stable training and efficient sampling, while achieving significant improvements over Glow for density estimation on standard image benchmarks, considerably narrowing the gap to autoregressive models.
- Comment : In Proceedings of the Thirty-third Conference on Neural Information Processing Systems (NeurIPS 2019)
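The sketch below is a minimal illustration of the idea summarized in the abstract: a convolution with a small, masked kernel whose receptive field excludes the current pixel, used to parameterize an elementwise affine flow step so that the Jacobian is triangular and the log-determinant is cheap to compute. The class names, the single mask direction, and the simplified flow unit are illustrative assumptions, not the authors' MaCow implementation (which combines several masked-convolution units with Glow-style steps).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedConv2d(nn.Conv2d):
    """3x3 convolution whose kernel is masked so each output position only
    sees pixels above it and to its left (the current pixel is excluded)."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__(in_ch, out_ch, kernel_size, padding=kernel_size // 2)
        k = kernel_size
        mask = torch.ones_like(self.weight)
        mask[:, :, k // 2, k // 2:] = 0  # zero the center pixel and everything to its right
        mask[:, :, k // 2 + 1:, :] = 0   # zero all rows below the center row
        self.register_buffer("mask", mask)

    def forward(self, x):
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        self.stride, self.padding)

class MaskedConvFlowUnit(nn.Module):
    """Elementwise affine flow z = s(x) * x + b(x), where s and b at each pixel
    depend only on a masked local neighborhood; the Jacobian is therefore
    triangular and log|det J| is simply the sum of log s."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.net = MaskedConv2d(channels, 2 * channels, kernel_size)

    def forward(self, x):
        s, b = self.net(x).chunk(2, dim=1)
        s = torch.sigmoid(s + 2.0)                 # keep scales positive and near 1
        z = s * x + b
        log_det = s.log().flatten(1).sum(dim=-1)   # per-example log-determinant
        return z, log_det

# Usage: one flow step on a batch of 8x8 "images" with 3 channels.
x = torch.randn(4, 3, 8, 8)
z, log_det = MaskedConvFlowUnit(3)(x)
print(z.shape, log_det.shape)  # torch.Size([4, 3, 8, 8]) torch.Size([4])
```

Because the scale and shift at each position depend only on earlier pixels in raster order, the transform can be inverted sequentially, while the restricted kernel keeps both training and sampling local and efficient, which is the property the abstract highlights.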
Details
- Database : arXiv
- Journal : NeurIPS 2019
- Publication Type : Report
- Accession number : edsarx.1902.04208
- Document Type : Working Paper