
G2NPAN: GAN-guided nuance perceptual attention network for multimodal medical fusion image quality assessment.

Authors :
Chuangeng Tian
Lei Zhang
Source :
Frontiers in Neuroscience; 2024, p1-11, 11p
Publication Year :
2024

Abstract

Multimodal medical fusion images (MMFI) are formed by fusing medical images of two or more modalities with the aim of displaying as much valuable information as possible in a single image. However, because different fusion algorithms adopt different strategies, the quality of the generated fused images varies widely. Thus, an effective blind image quality assessment (BIQA) method is urgently required. The challenge of MMFI quality assessment is to enable the network to perceive the nuances between fused images of different qualities, and the key to successful BIQA is the availability of valid reference information. To this end, this work proposes a generative adversarial network (GAN)-guided nuance perceptual attention network (G2NPAN) to implement BIQA for MMFI. Specifically, we achieve the blind evaluation style via the design of a GAN and develop a Unique Feature Warehouse module to learn effective features of fused images at the pixel level. A redesigned loss function guides the network to perceive image quality. Finally, a class activation mapping supervised quality assessment network is employed to obtain the MMFI quality score. Extensive experiments and validation have been conducted on a database of medical fusion images, and the proposed method is superior to state-of-the-art BIQA methods.
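
The abstract describes the pipeline only at a high level. The following is a minimal, illustrative PyTorch-style sketch of how a fused image could be mapped to a scalar quality score through a pixel-level feature module and an attention-weighted regression head. The class names (FeatureWarehouse, QualityHead, G2NPANSketch), channel sizes, and the MSE training loss are assumptions made for exposition; they do not reproduce the paper's GAN guidance, Unique Feature Warehouse design, or CAM supervision.

# Illustrative sketch only; names and layers are assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class FeatureWarehouse(nn.Module):
    """Hypothetical pixel-level feature extractor standing in for the paper's
    Unique Feature Warehouse module."""
    def __init__(self, in_ch=1, feat_ch=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.body(x)

class QualityHead(nn.Module):
    """Regression head: pools attention-weighted features into a scalar score."""
    def __init__(self, feat_ch=32):
        super().__init__()
        self.attn = nn.Conv2d(feat_ch, 1, 1)   # coarse spatial attention map (CAM-like stand-in)
        self.fc = nn.Linear(feat_ch, 1)
    def forward(self, f):
        w = torch.sigmoid(self.attn(f))        # (B, 1, H, W) attention weights
        pooled = (f * w).mean(dim=(2, 3))      # attention-weighted global pooling
        return self.fc(pooled).squeeze(1)      # (B,) predicted quality scores

class G2NPANSketch(nn.Module):
    """End-to-end sketch: fused image -> pixel-level features -> quality score."""
    def __init__(self):
        super().__init__()
        self.warehouse = FeatureWarehouse()
        self.head = QualityHead()
    def forward(self, fused):
        return self.head(self.warehouse(fused))

if __name__ == "__main__":
    model = G2NPANSketch()
    fused = torch.rand(4, 1, 128, 128)               # a batch of fused medical images
    mos = torch.rand(4)                               # placeholder quality labels
    score = model(fused)
    loss = nn.functional.mse_loss(score, mos)         # stand-in for the paper's redesigned loss
    loss.backward()
    print(score.shape, float(loss))

In the actual method, the GAN branch supplies the reference information that this sketch omits, and the attention supervision comes from class activation mapping rather than a freely learned map.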

Details

Language :
English
ISSN :
1662-4548
Database :
Complementary Index
Journal :
Frontiers in Neuroscience
Publication Type :
Academic Journal
Accession number :
177499506
Full Text :
https://doi.org/10.3389/fnins.2024.1415679