
Distillation of Discrete Diffusion through Dimensional Correlations

Authors:
Hayakawa, Satoshi
Takida, Yuhta
Imaizumi, Masaaki
Wakaki, Hiromi
Mitsufuji, Yuki
Publication Year: 2024

Abstract

Diffusion models have demonstrated exceptional performance in various fields of generative modeling. While they often outperform competitors, including VAEs and GANs, in sample quality and diversity, they suffer from slow sampling speed due to their iterative nature. Recently, distillation techniques and consistency models have mitigated this issue in continuous domains, but discrete diffusion models face specific challenges on the path to faster generation. Most notably, in the current literature, correlations between different dimensions (pixels, locations) are ignored, in both the modeling and the loss functions, due to computational limitations. In this paper, we propose "mixture" models for discrete diffusion that can capture dimensional correlations while remaining scalable, and we provide a set of loss functions for distilling the iterations of existing models. Two primary theoretical insights underpin our approach: first, dimensionally independent models can approximate the data distribution well if they are allowed to take many sampling steps, and second, our loss functions enable mixture models to distill such many-step conventional models into just a few steps by learning the dimensional correlations. We empirically demonstrate that the proposed method works in practice by distilling a continuous-time discrete diffusion model pretrained on the CIFAR-10 dataset.

Comment: To be presented at the Machine Learning and Compression Workshop @ NeurIPS 2024
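The modeling idea behind the abstract can be illustrated on a toy example: a single dimensionally independent (product) distribution cannot represent correlations between dimensions, whereas a mixture of product distributions can. The following minimal Python sketch is illustrative only, not the authors' code; the two-dimensional binary target and all variable names are our own assumptions.

    import numpy as np

    # Toy joint distribution over two binary dimensions with perfect
    # correlation: p(0,0) = p(1,1) = 0.5, zero mass elsewhere.
    target = np.array([[0.5, 0.0],
                       [0.0, 0.5]])

    # Best dimensionally independent approximation: the outer product of
    # the marginals. It spreads mass uniformly and loses the correlation.
    m0, m1 = target.sum(axis=1), target.sum(axis=0)
    factorized = np.outer(m0, m1)

    # A 2-component mixture of product distributions recovers the target
    # exactly: each component is concentrated on one correlated mode.
    comp_a = np.outer([1.0, 0.0], [1.0, 0.0])  # all mass on (0, 0)
    comp_b = np.outer([0.0, 1.0], [0.0, 1.0])  # all mass on (1, 1)
    mixture = 0.5 * comp_a + 0.5 * comp_b

    print(factorized)                    # [[0.25 0.25] [0.25 0.25]]
    print(np.allclose(mixture, target))  # True

In the paper's setting, each component would itself be a scalable dimensionally independent denoiser, so the mixture remains tractable while the proposed distillation losses teach it the dimensional correlations that a many-step teacher produces implicitly.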

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2410.08709
Document Type: Working Paper