
Feature-fusion-oriented algorithm for multi-type image tampering detection and localization (面向特征融合的图像多窜改检测与定位算法).

Authors :
兰萍
李燕
Source :
Application Research of Computers / Jisuanji Yingyong Yanjiu. Dec 2022, Vol. 39 Issue 12, p3791-3796. 6p.
Publication Year :
2022

Abstract

Most existing image tampering detection methods target only a single tampering type and localize the boundaries of tampered regions with low accuracy. To address this, the paper proposes a U-shaped network with a dual-stream encoder-decoder architecture for image tampering detection and localization. First, the method fuses low-level and high-level features of the tampered image through skip connections between the encoder and decoder, and fuses the encoder's output features using atrous convolution and a CBAM attention mechanism, so that the network localizes tampered regions of different scales more accurately. Second, to improve detection accuracy at tampered-region boundaries, the algorithm constructs a tampered-boundary dataset using image morphology. Finally, multiple loss functions jointly optimize the network: a cross-entropy loss measures the tampered-region error of the prediction map, and a root-mean-square loss measures its tampered-boundary error. Experiments on four public datasets (CASIA, Columbia, NIST16, and Coverage) show that the method effectively detects tampered areas in forged images produced by splicing and copy-paste tampering and outputs pixel-level localization maps of the tampered regions. Compared with other mainstream tampering detection methods, it achieves the highest AUC values on the CASIA and Columbia datasets and the highest F1 value on the Columbia dataset. [ABSTRACT FROM AUTHOR]
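
The abstract says the encoder's output features are fused with atrous convolution and a CBAM attention mechanism, but the exact block design is not given. Below is a minimal PyTorch sketch of an ASPP-style bank of parallel dilated convolutions refined by CBAM; the class names, dilation rates, and reduction ratio are illustrative assumptions, not the paper's specification.

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module (Woo et al., 2018):
    channel attention followed by spatial attention."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP for channel attention, applied to pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
        )
        # 7x7 conv over stacked mean/max maps for spatial attention.
        self.spatial = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        # Channel attention from global average- and max-pooled features.
        avg = self.mlp(x.mean(dim=(2, 3), keepdim=True))
        mx = self.mlp(x.amax(dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention from channel-wise mean and max maps.
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))

class AtrousFusion(nn.Module):
    """Parallel atrous (dilated) convolutions at several rates,
    concatenated, projected, and refined with CBAM -- an ASPP-style
    stand-in for the paper's encoder-output fusion block."""
    def __init__(self, in_ch: int, out_ch: int, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r) for r in rates
        )
        self.project = nn.Conv2d(out_ch * len(rates), out_ch, 1)
        self.cbam = CBAM(out_ch)

    def forward(self, x):
        x = torch.cat([b(x) for b in self.branches], dim=1)
        return self.cbam(self.project(x))
```

The multi-rate branches give the block receptive fields at several scales, which matches the abstract's claim that the fusion step helps localize tampered regions of different sizes.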
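The tampered-boundary dataset is said to be produced with image morphology, but the operator is not specified. A minimal sketch, assuming the common dilation-minus-erosion construction with OpenCV; the function name `make_boundary_label` and the 3-pixel band width are assumptions.

```python
import cv2
import numpy as np

def make_boundary_label(mask: np.ndarray, width: int = 3) -> np.ndarray:
    """Derive a tampered-boundary map from a binary tampered-region mask.
    The boundary is the thin ring between the dilated and eroded masks."""
    mask = (mask > 0).astype(np.uint8)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (width, width))
    dilated = cv2.dilate(mask, kernel)
    eroded = cv2.erode(mask, kernel)
    return dilated - eroded  # 1 on a narrow band around the region edge
```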
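The abstract names two losses: cross-entropy for the tampered-region prediction and root-mean-square for the tampered-boundary prediction. A minimal sketch of how they could be combined, assuming binary targets and a simple weighted sum; the `boundary_weight` parameter and equal default weighting are assumptions.

```python
import torch
import torch.nn.functional as F

def combined_loss(region_logits, boundary_logits, region_gt, boundary_gt,
                  boundary_weight: float = 1.0):
    """Cross-entropy on the tampered-region map plus a root-mean-square
    penalty on the tampered-boundary map, summed with a tunable weight."""
    region_loss = F.binary_cross_entropy_with_logits(region_logits, region_gt)
    boundary_loss = torch.sqrt(
        F.mse_loss(torch.sigmoid(boundary_logits), boundary_gt)
    )
    return region_loss + boundary_weight * boundary_loss
```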

Details

Language :
Chinese
ISSN :
1001-3695
Volume :
39
Issue :
12
Database :
Academic Search Index
Journal :
Application Research of Computers / Jisuanji Yingyong Yanjiu
Publication Type :
Academic Journal
Accession number :
160874121
Full Text :
https://doi.org/10.19734/j.issn.1001-3695.2022.04.0216