
Multi-focus image fusion combining focus-region-level partition and pulse-coupled neural network.

Authors :
He, Kangjian
Zhou, Dongming
Zhang, Xuejie
Nie, Rencan
Jin, Xin
Source :
Soft Computing - A Fusion of Foundations, Methodologies & Applications. Jul2019, Vol. 23 Issue 13, p4685-4699. 15p.
Publication Year :
2019

Abstract

Multi-scale transform (MST)-based methods have recently become popular for multi-focus image fusion because of their superior performance: the fused image retains more edge and texture detail. However, most MST-based methods operate at the pixel level, which requires a large amount of data processing. Moreover, different fusion strategies cannot completely preserve the clear pixels within the focused areas of the source images in the fused result. To solve these problems, this paper proposes a novel image fusion method based on focus-region-level partition and a pulse-coupled neural network (PCNN) in the nonsubsampled contourlet transform (NSCT) domain. A clarity evaluation function is constructed to determine which regions of each source image are in focus. By removing the focused regions from the source images, the non-focus regions, which contain the edge pixels of the focused regions, are obtained. Next, the non-focus regions are decomposed into a series of subimages using NSCT, and these subimages are fused with different strategies to obtain the fused non-focus regions. Finally, the fused result is obtained by combining the focused regions with the fused non-focus regions. Experimental results show that the proposed scheme retains more clear pixels from the two source images and preserves more detail in the non-focus regions, outperforming conventional methods in both visual inspection and objective evaluation. [ABSTRACT FROM AUTHOR]
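The pipeline the abstract describes — evaluate clarity per region, keep the focused pixels from each source, and fall back to a separate fusion rule elsewhere — can be sketched in a few lines. This is a minimal illustration only: it substitutes a local-variance clarity measure for the paper's clarity evaluation function, and a plain average for the NSCT+PCNN fusion of the non-focus band; the names `clarity_map` and `fuse` are hypothetical, not from the paper.

```python
import numpy as np

def clarity_map(img, k=3):
    """Local variance in a k-by-k window as a stand-in clarity (focus)
    measure; the paper constructs a dedicated clarity evaluation function."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="reflect")
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = p[i:i + k, j:j + k].var()
    return out

def fuse(a, b, k=3):
    """Focus-region-level fusion sketch: copy each pixel from whichever
    source is locally sharper. Pixels where neither source is clearly
    sharper play the role of the 'non-focus' band, which the paper fuses
    with NSCT+PCNN; here they are simply averaged."""
    ca, cb = clarity_map(a, k), clarity_map(b, k)
    fused = np.where(ca > cb, a, b).astype(float)
    tie = np.abs(ca - cb) < 1e-6
    fused[tie] = (a.astype(float)[tie] + b.astype(float)[tie]) / 2.0
    return fused
```

With two synthetic sources, each sharp (high local variance) on the opposite half, the sketch reassembles the sharp halves, mirroring the region-selection step of the method.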

Details

Language :
English
ISSN :
1432-7643
Volume :
23
Issue :
13
Database :
Academic Search Index
Journal :
Soft Computing - A Fusion of Foundations, Methodologies & Applications
Publication Type :
Academic Journal
Accession number :
136694164
Full Text :
https://doi.org/10.1007/s00500-018-3118-9