
G2NPAN: GAN-guided nuance perceptual attention network for multimodal medical fusion image quality assessment

Authors :
Chuangeng Tian
Lei Zhang
Source :
Frontiers in Neuroscience, Vol 18 (2024)
Publication Year :
2024
Publisher :
Frontiers Media S.A., 2024.

Abstract

Multimodal medical fusion images (MMFI) are formed by fusing medical images of two or more modalities with the aim of displaying as much valuable information as possible in a single image. However, because different fusion algorithms follow different strategies, the quality of the generated fused images is uneven. Thus, an effective blind image quality assessment (BIQA) method is urgently required. The challenge of MMFI quality assessment is to enable the network to perceive the nuances between fused images of different qualities, and the key to the success of BIQA is the availability of valid reference information. To this end, this work proposes a generative adversarial network (GAN)-guided nuance perceptual attention network (G2NPAN) to implement BIQA for MMFI. Specifically, we achieve blind evaluation through the design of a GAN and develop a Unique Feature Warehouse module to learn the effective features of fused images at the pixel level. A redesigned loss function guides the network to perceive image quality. Finally, a class activation mapping (CAM)-supervised quality assessment network is employed to obtain the MMFI quality score. Extensive experiments and validation have been conducted on a database of medical fusion images, and the proposed method is superior to state-of-the-art BIQA methods.
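To make the pipeline summarized above more concrete, the sketch below shows a minimal blind IQA regressor in PyTorch: convolutional feature extraction, a simple attention stage, and a regression head that outputs a single quality score per fused image. This is an illustrative assumption, not the authors' G2NPAN; the module names, channel sizes, and the squeeze-and-excitation attention used as a stand-in for the paper's nuance perceptual attention are all hypothetical choices, and the GAN guidance, Unique Feature Warehouse, and CAM supervision described in the abstract are omitted.

```python
# Minimal, illustrative blind IQA regressor for fused images (NOT the
# authors' G2NPAN). All structural choices here are assumptions made only
# to illustrate the general idea: features -> attention -> quality score.
import torch
import torch.nn as nn


class SimpleChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (a stand-in for the
    paper's nuance perceptual attention, whose exact design is not given)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # reweight feature channels


class BlindFusionIQA(nn.Module):
    """Hypothetical blind IQA network: CNN features -> attention -> score."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.attention = SimpleChannelAttention(64)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1)
        )

    def forward(self, fused_image: torch.Tensor) -> torch.Tensor:
        return self.head(self.attention(self.features(fused_image)))


# Usage: a batch of grayscale fused images yields one predicted score each.
scores = BlindFusionIQA()(torch.randn(2, 1, 128, 128))
print(scores.shape)  # torch.Size([2, 1])
```

In a full BIQA setup such a regressor would be trained against subjective quality scores; the paper additionally uses GAN-generated reference information and CAM supervision, for which readers should consult the full text linked below.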

Details

Language :
English
ISSN :
1662-453X
Volume :
18
Database :
Directory of Open Access Journals
Journal :
Frontiers in Neuroscience
Publication Type :
Academic Journal
Accession number :
edsdoj.60d65b0cf7a488d876dc02a2ae3db22
Document Type :
article
Full Text :
https://doi.org/10.3389/fnins.2024.1415679