
Adaptive Knowledge Distillation for High-Quality Unsupervised MRI Reconstruction With Model-Driven Priors.

Authors :
Wu Z
Li X
Source :
IEEE journal of biomedical and health informatics [IEEE J Biomed Health Inform] 2024 Jun; Vol. 28 (6), pp. 3571-3582. Date of Electronic Publication: 2024 Jun 06.
Publication Year :
2024

Abstract

Magnetic Resonance Imaging (MRI) reconstruction has made significant progress with the introduction of Deep Learning (DL) techniques combined with Compressed Sensing (CS). However, most existing methods require large, fully sampled training datasets to supervise the training process, and such data may be unavailable in many applications. Current unsupervised models also show limitations in performance or speed and may face mismatched data distributions at test time. This paper proposes an unsupervised method for training competitive reconstruction models that generate high-quality samples in an end-to-end manner. First, teacher models are trained in a self-supervised manner by filling in re-undersampled images and comparing the results against the original undersampled images. The teacher models are then distilled to train a cascade model that can leverage the entire undersampled k-space during both training and testing. Additionally, we propose an adaptive distillation method that re-weights samples based on the variance of the teachers' outputs, which reflects the confidence of the reconstruction results, to improve the quality of distillation. Experimental results on multiple datasets demonstrate that our method significantly accelerates inference while preserving or even improving performance relative to the teacher model. In our tests, the distilled models achieve 5%-10% improvements in PSNR and SSIM compared with no distillation and are 10 times faster than the teacher.
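As a rough illustration of the adaptive re-weighting idea described in the abstract, the PyTorch-style sketch below weights each training sample's distillation loss by the inverse of the variance across teacher reconstructions, so that samples on which the teachers agree (high confidence) contribute more. The tensor shapes, the inverse-variance weighting, and the mean-teacher target are illustrative assumptions, not details taken from the paper.

    import torch

    def adaptive_distillation_loss(teacher_recons, student_recon, eps=1e-8):
        # teacher_recons: (T, B, C, H, W) reconstructions from T teacher models
        # student_recon:  (B, C, H, W)    reconstruction from the student (cascade) model
        # Mean teacher reconstruction used here as the distillation target (assumption)
        teacher_mean = teacher_recons.mean(dim=0)                      # (B, C, H, W)
        # Per-sample variance across teachers, averaged over image dimensions
        sample_var = teacher_recons.var(dim=0).flatten(1).mean(dim=1)  # (B,)
        # Confidence weights: low teacher variance -> large weight (assumed inverse-variance scheme)
        weights = 1.0 / (sample_var + eps)
        weights = weights / weights.sum() * weights.numel()            # normalize to mean 1
        # Per-sample L2 distillation loss against the mean teacher target
        per_sample = ((student_recon - teacher_mean) ** 2).flatten(1).mean(dim=1)
        return (weights.detach() * per_sample).mean()

In this sketch the weights are detached so the re-weighting only scales the gradient of the student's loss; the paper's exact weighting function and distillation target may differ.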

Details

Language :
English
ISSN :
2168-2208
Volume :
28
Issue :
6
Database :
MEDLINE
Journal :
IEEE journal of biomedical and health informatics
Publication Type :
Academic Journal
Accession number :
38349826
Full Text :
https://doi.org/10.1109/JBHI.2024.3365784