
GIPC-GAN: an end-to-end gradient and intensity joint proportional constraint generative adversarial network for multi-focus image fusion

Authors :
Junwu Li
Binhua Li
Yaoxi Jiang
Source :
Complex & Intelligent Systems, Vol 9, Iss 6, Pp 7395-7422 (2023)
Publication Year :
2023
Publisher :
Springer, 2023.

Abstract

To address the boundary blurring and information loss that affect multi-focus image fusion methods based on generated decision maps, this paper proposes a new gradient and intensity joint proportional constraint generative adversarial network for multi-focus image fusion, named GIPC-GAN. First, a labeled multi-focus image dataset is constructed by applying the deep region competition algorithm to a public dataset. This allows the network to be trained and to generate fused images in an end-to-end manner, avoiding the boundary errors caused by artificially constructed decision maps. Second, the most meaningful information in the multi-focus fusion task is defined as target intensity and detail gradient, and a joint loss function based on proportional maintenance of intensity and gradient is proposed. This loss forces the generated image to retain as much of the source images' target intensity, global texture, and local texture as possible, and to maintain structural consistency between the fused image and the source images. Third, a GAN is introduced and an adversarial game is established between the generator and the discriminator, so that the intensity structure and texture gradient retained in the fused image are kept in balance and the detail information of the fused image is further enhanced. Finally, experiments are conducted on two public multi-focus datasets and a multi-source multi-focus image sequence dataset, with comparisons against 7 state-of-the-art algorithms. The results show that images fused by the GIPC-GAN model are superior to those of the comparison algorithms in both subjective appearance and objective metrics, and that the model essentially meets the requirements of real-time image fusion in terms of running efficiency and parameter count.
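
To illustrate the idea of a jointly constrained intensity-and-gradient loss described in the abstract, the following is a minimal PyTorch sketch. It is not the paper's exact formulation: the Sobel gradient operator, the element-wise maximum aggregation of the two source images, and the weights w_int and w_grad are illustrative assumptions.

# Minimal sketch (assumed formulation, not GIPC-GAN's exact loss):
# a joint intensity-and-gradient content loss for multi-focus fusion.
import torch
import torch.nn.functional as F

def sobel_gradient(img: torch.Tensor) -> torch.Tensor:
    """Approximate gradient magnitude of a (N, 1, H, W) image with Sobel kernels."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=img.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(img, kx, padding=1)
    gy = F.conv2d(img, ky, padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)

def joint_content_loss(fused, src_a, src_b, w_int=1.0, w_grad=5.0):
    """Encourage the fused image to keep the dominant intensity and the
    sharpest gradients of the two sources (weights are hypothetical)."""
    # Intensity target: element-wise maximum of the two source images.
    target_int = torch.maximum(src_a, src_b)
    loss_int = F.l1_loss(fused, target_int)
    # Gradient target: per-pixel maximum of the source gradient magnitudes.
    target_grad = torch.maximum(sobel_gradient(src_a), sobel_gradient(src_b))
    loss_grad = F.l1_loss(sobel_gradient(fused), target_grad)
    return w_int * loss_int + w_grad * loss_grad

# Usage on dummy grayscale tensors:
a, b = torch.rand(2, 1, 1, 64, 64)
fused = (a + b) / 2          # stand-in for a generator output
print(joint_content_loss(fused, a, b))

In a full adversarial setup, a discriminator loss would be added to this content term so that retained intensity structure and texture gradient stay balanced, as the abstract describes.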

Details

Language :
English
ISSN :
2199-4536 and 2198-6053
Volume :
9
Issue :
6
Database :
Directory of Open Access Journals
Journal :
Complex & Intelligent Systems
Publication Type :
Academic Journal
Accession number :
edsdoj.27e845a93ee44385bab21f909e0f6137
Document Type :
article
Full Text :
https://doi.org/10.1007/s40747-023-01151-y