
RXDNFuse: A aggregated residual dense network for infrared and visible image fusion

Authors:
Yida Zhong
Haitao Jia
Yongzhi Long
Yadong Jiang
Yuming Jia
Source:
Information Fusion. 69:128-141
Publication Year:
2021
Publisher:
Elsevier BV, 2021.

Abstract

This study proposes a novel unsupervised network for the infrared and visible (IR/VIS) image fusion task, termed RXDNFuse, which is based on an aggregated residual dense network. In contrast to conventional fusion networks, RXDNFuse is designed as an end-to-end model that combines the structural advantages of ResNeXt and DenseNet, thereby overcoming the limitations of manually and intricately designed activity-level measurements and fusion rules. Our method formulates image fusion as the problem of proportionally maintaining the structure and intensity of the IR/VIS source images. Through comprehensive feature extraction and combination, RXDNFuse automatically estimates the degree to which information from each source image should be preserved and extracts hierarchical features to achieve effective fusion. Moreover, we design two loss function strategies to optimize the similarity constraint and the training of network parameters, further improving the quality of detailed information. We also generalize RXDNFuse to fuse images of different resolutions and RGB-scale images. Extensive qualitative and quantitative evaluations show that our results effectively preserve abundant textural details and highlighted thermal radiation information. In particular, our results form a comprehensive representation of scene information that is more consistent with the human visual perception system.
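A minimal sketch of the kind of building block the abstract describes: a densely connected block whose internal layers use ResNeXt-style grouped (aggregated) convolutions. This is not the authors' released code; the class name, layer widths, growth rate, and cardinality below are illustrative assumptions, and the paper's actual block design and loss functions may differ.

```python
# Hypothetical sketch (not the paper's implementation): an "aggregated residual
# dense" block combining DenseNet-style feature concatenation with ResNeXt-style
# grouped convolutions, as the abstract describes at a high level.
import torch
import torch.nn as nn

class AggregatedResidualDenseBlock(nn.Module):
    """Dense block whose internal layers use grouped (aggregated) convolutions.

    growth_rate, num_layers, and cardinality are illustrative hyperparameters,
    not values taken from the paper.
    """
    def __init__(self, in_channels: int, growth_rate: int = 32,
                 num_layers: int = 4, cardinality: int = 8):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                # 1x1 conv mixes the concatenated features before the grouped conv
                nn.Conv2d(channels, growth_rate * cardinality, kernel_size=1),
                nn.ReLU(inplace=True),
                # ResNeXt-style aggregated transformation via a grouped 3x3 conv
                nn.Conv2d(growth_rate * cardinality, growth_rate,
                          kernel_size=3, padding=1, groups=cardinality),
                nn.ReLU(inplace=True),
            ))
            channels += growth_rate  # dense connectivity widens the next input
        # 1x1 "local feature fusion" back to the block's input width
        self.fuse = nn.Conv2d(channels, in_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        # residual connection around the densely aggregated features
        return x + self.fuse(torch.cat(features, dim=1))

# Example: a 64-channel feature map passes through the block with its shape unchanged.
# block = AggregatedResidualDenseBlock(in_channels=64)
# y = block(torch.randn(1, 64, 128, 128))   # -> torch.Size([1, 64, 128, 128])
```

The 1x1 fusion convolution and the outer residual connection follow the usual residual dense network pattern, while the grouped 3x3 convolution supplies the ResNeXt-style aggregation referenced in the title; the similarity-constraint losses are not sketched because the abstract does not specify their form.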

Details

ISSN:
1566-2535
Volume:
69
Database:
OpenAIRE
Journal:
Information Fusion
Accession number:
edsair.doi...........4a933f63f823fd4b58ac103346e46f01
Full Text:
https://doi.org/10.1016/j.inffus.2020.11.009