
One Step Diffusion-based Super-Resolution with Time-Aware Distillation

Authors:
He, Xiao
Tang, Huaao
Tu, Zhijun
Zhang, Junchao
Cheng, Kun
Chen, Hanting
Guo, Yong
Zhu, Mingrui
Wang, Nannan
Gao, Xinbo
Hu, Jie
Publication Year:
2024

Abstract

Diffusion-based image super-resolution (SR) methods have shown promise in reconstructing high-resolution images with fine details from low-resolution counterparts. However, these approaches typically require tens or even hundreds of iterative sampling steps, resulting in significant latency. Recently, techniques have been devised to enhance the sampling efficiency of diffusion-based SR models via knowledge distillation. Nonetheless, when aligning the knowledge of student and teacher models, these solutions either rely solely on pixel-level loss constraints or neglect the fact that diffusion models prioritize different levels of information at different time steps. To achieve effective and efficient image super-resolution, we propose a time-aware diffusion distillation method, named TAD-SR. Specifically, we introduce a novel score distillation strategy to align the data distributions of the student and teacher outputs after minor noise perturbation. This distillation strategy enables the student network to concentrate more on high-frequency details. Furthermore, to mitigate performance limitations stemming from distillation, we integrate a latent adversarial loss and devise a time-aware discriminator that leverages diffusion priors to effectively distinguish between real and generated images. Extensive experiments on synthetic and real-world datasets demonstrate that the proposed method achieves performance comparable or even superior to both previous state-of-the-art (SOTA) methods and the teacher model in just one sampling step. Code is available at https://github.com/LearningHx/TAD-SR.

Comment: 18 pages
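
The abstract's two core ingredients, score distillation under minor noise perturbation and a time-aware adversarial term, can be summarized as a single training step. The sketch below is an illustrative reading of the abstract, not the released code: the module interfaces (student, teacher, discriminator), the diffusers-style scheduler.add_noise call, and the loss weighting are all assumptions.

```python
# Minimal sketch of a time-aware distillation training step in the spirit of
# TAD-SR. All module names and hyperparameters here are hypothetical.
import torch
import torch.nn.functional as F

def tad_sr_step(student, teacher, discriminator, scheduler,
                lr_latent, max_t=200, adv_weight=0.1):
    # 1. One-step student prediction: map the low-resolution latent
    #    directly to a clean high-resolution latent.
    x_student = student(lr_latent)

    # 2. Perturb the student output with *minor* noise (small timestep t),
    #    so the matching signal concentrates on high-frequency details.
    t = torch.randint(1, max_t, (lr_latent.size(0),), device=lr_latent.device)
    noise = torch.randn_like(x_student)
    x_noisy = scheduler.add_noise(x_student, noise, t)

    # 3. Score distillation: the frozen teacher denoises the perturbed
    #    sample; the student output is pulled toward that estimate.
    with torch.no_grad():
        x_teacher = teacher(x_noisy, t, cond=lr_latent)
    distill_loss = F.mse_loss(x_student, x_teacher)

    # 4. Time-aware adversarial term: the discriminator conditions on the
    #    timestep as well as the perturbed latent, leveraging diffusion
    #    priors to separate real from generated samples.
    adv_loss = -discriminator(x_noisy, t).mean()

    return distill_loss + adv_weight * adv_loss
```

In a complete setup the discriminator would be optimized in alternation with the student, e.g. with a hinge or BCE loss on ground-truth high-resolution latents perturbed at the same timesteps.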

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2408.07476
Document Type:
Working Paper