
Fast Diffusion Probabilistic Model Sampling through the lens of Backward Error Analysis

Authors:
Gao, Yansong
Pan, Zhihong
Zhou, Xin
Kang, Le
Chaudhari, Pratik
Publication Year:
2023

Abstract

Denoising diffusion probabilistic models (DDPMs) are a class of powerful generative models. The past few years have witnessed the great success of DDPMs in generating high-fidelity samples. A significant limitation of DDPMs is their slow sampling procedure: DDPMs generally need hundreds or thousands of sequential function evaluations (steps) of neural networks to generate a sample. This paper aims to develop a fast sampling method for DDPMs that requires far fewer steps while retaining high sample quality. The inference process of DDPMs approximates solving the corresponding diffusion ordinary differential equations (diffusion ODEs) in the continuous limit. This work analyzes how the backward error affects the diffusion ODEs and the sample quality in DDPMs. We propose fast sampling through the \textbf{Restricting Backward Error schedule (RBE schedule)}, based on dynamically moderating the long-time backward error. Our method accelerates DDPMs without any further training. Our experiments show that sampling with an RBE schedule generates high-quality samples within only 8 to 20 function evaluations on various benchmark datasets. We achieve a 12.01 FID in 8 function evaluations on ImageNet $128\times128$, and a $20\times$ speedup compared with previous baseline samplers.

Comment: arXiv admin note: text overlap with arXiv:2101.12176 by other authors
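To illustrate the trade-off the abstract describes (this is a generic toy, not the paper's RBE schedule or its diffusion ODE): fewer sequential function evaluations mean a coarser discretization of the underlying ODE, and hence a larger backward (truncation) error. A minimal sketch with explicit Euler on a simple linear ODE, where the exact solution is known:

```python
import math

def euler_solve(x0, f, T, n_steps):
    """Explicit Euler integration of dx/dt = f(x) from t=0 to t=T.
    Each step plays the role of one 'function evaluation' in a sampler."""
    x, dt = x0, T / n_steps
    for _ in range(n_steps):
        x = x + dt * f(x)
    return x

# Toy linear ODE dx/dt = -x, with exact solution x(T) = x0 * exp(-T).
f = lambda x: -x
x0, T = 1.0, 4.0
exact = x0 * math.exp(-T)

# Coarse schedule (8 steps) vs. fine schedule (80 steps).
err_coarse = abs(euler_solve(x0, f, T, 8) - exact)
err_fine = abs(euler_solve(x0, f, T, 80) - exact)
print(err_coarse, err_fine)
```

The fine schedule yields a much smaller discretization error; a method in the spirit of the paper would instead choose a small number of (non-uniform) steps while keeping this error controlled.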

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2304.11446
Document Type:
Working Paper