
An Edit Friendly DDPM Noise Space: Inversion and Manipulations

Authors: Huberman-Spiegelglas, Inbar; Kulikov, Vladimir; Michaeli, Tomer
Publication Year: 2023

Abstract

Denoising diffusion probabilistic models (DDPMs) employ a sequence of white Gaussian noise samples to generate an image. In analogy with GANs, those noise maps could be considered as the latent code associated with the generated image. However, this native noise space does not possess a convenient structure, and is thus challenging to work with in editing tasks. Here, we propose an alternative latent noise space for DDPM that enables a wide range of editing operations via simple means, and present an inversion method for extracting these edit-friendly noise maps for any given image (real or synthetically generated). As opposed to the native DDPM noise space, the edit-friendly noise maps do not have a standard normal distribution and are not statistically independent across timesteps. However, they allow perfect reconstruction of any desired image, and simple transformations on them translate into meaningful manipulations of the output image (e.g. shifting, color edits). Moreover, in text-conditional models, fixing those noise maps while changing the text prompt, modifies semantics while retaining structure. We illustrate how this property enables text-based editing of real images via the diverse DDPM sampling scheme (in contrast to the popular non-diverse DDIM inversion). We also show how it can be used within existing diffusion-based editing methods to improve their quality and diversity.

Webpage: https://inbarhub.github.io/DDPM_inversion
Comment: CVPR 2024. Code and examples are available at https://github.com/inbarhub/DDPM_inversion
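To make the idea of "edit-friendly" noise maps concrete, below is a minimal sketch of what such an inversion could look like, assuming a standard DDPM sampler of the form x_{t-1} = mu_t(x_t) + sigma_t * z_t. The linear beta schedule, the variance choice sigma_t^2 = beta_t, the way the noisy trajectory is constructed, and the placeholder `eps_model` are all assumptions for illustration; the authors' exact procedure is given in the paper and the code repository linked above.

```python
# Hypothetical sketch: invert an image x0 into noise maps z_t that the DDPM
# sampler can consume to reconstruct x0 exactly. NOT the authors' reference code.
import torch

T = 100
betas = torch.linspace(1e-4, 0.02, T)         # assumed linear beta schedule
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

def eps_model(x_t, t):
    """Placeholder noise predictor; a trained denoiser (e.g. a U-Net) goes here."""
    return torch.zeros_like(x_t)

def edit_friendly_inversion(x0):
    # 1) Build a noisy trajectory x_1..x_T by noising x0 with freshly drawn noise
    #    at every level (rather than by running the forward chain step by step).
    xs = [x0]
    for t in range(1, T + 1):
        a_bar = alpha_bars[t - 1]
        xs.append(torch.sqrt(a_bar) * x0
                  + torch.sqrt(1 - a_bar) * torch.randn_like(x0))

    # 2) For each step, solve the DDPM update x_{t-1} = mu_t(x_t) + sigma_t * z_t
    #    for z_t, so re-running the sampler with these fixed z_t reproduces x0.
    zs = []
    for t in range(T, 0, -1):
        x_t, x_prev = xs[t], xs[t - 1]
        eps = eps_model(x_t, t)
        mu = (x_t - betas[t - 1] / torch.sqrt(1 - alpha_bars[t - 1]) * eps) \
             / torch.sqrt(alphas[t - 1])
        sigma = torch.sqrt(betas[t - 1])      # assumed variance choice sigma_t^2 = beta_t
        zs.append((x_prev - mu) / sigma)      # "edit-friendly" noise map for step t
    return xs[T], zs                          # starting point x_T and noise maps z_T..z_1

# Usage (dummy input): x_T, zs = edit_friendly_inversion(torch.randn(1, 3, 64, 64))
```

Re-running the DDPM sampler with these fixed noise maps reconstructs x0 by construction; in a text-conditional model, keeping the z_t fixed while swapping the prompt is the kind of structure-preserving edit the abstract describes.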

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2304.06140
Document Type: Working Paper