A Passive Approach for Detecting Image Splicing Based on Deep Learning and Wavelet Transform

Bibliographic Details
Published in: Arabian Journal for Science and Engineering (2011), Vol. 45, no. 4, pp. 3379-3386
Main Authors: Abd El-Latif, Eman I.; Taha, Ahmed; Zayed, Hala H.
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg / Springer Nature B.V., 01.04.2020

Summary: Splicing image forgery detection has become a significant research subject in multimedia forensics and security because spliced images are widespread and difficult to detect. Many algorithms have already been proposed for image splicing detection, but existing algorithms can suffer from problems such as high feature dimensionality and low accuracy with high false-positive rates. In this paper, an algorithm based on a deep learning approach and the wavelet transform is proposed to detect spliced images. In the deep learning stage, a convolutional neural network (CNN) automatically extracts features from the image; the discrete wavelet transform (DWT) is then applied to these features, and a support vector machine (SVM) is used for classification. Additional experiments are performed in which the discrete cosine transform (DCT) replaces the DWT and principal component analysis (PCA) is then applied. The proposed algorithm is evaluated on publicly available image splicing datasets (CASIA v1.0 and CASIA v2.0). It achieves high accuracy while using a relatively low-dimensional feature vector. The results demonstrate that the proposed algorithm is effective and achieves better performance in detecting spliced images.
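
The summary describes a pipeline in which a CNN extracts features, a transform (DWT, or DCT with PCA in the additional experiments) reduces them, and an SVM makes the final authentic/spliced decision. The Python sketch below only illustrates how such a pipeline can be assembled; it is not the authors' implementation, and the pretrained ResNet-18 backbone, Haar wavelet, and linear kernel are assumptions made for illustration.

import numpy as np
import pywt                                   # PyWavelets: 1-D discrete wavelet transform
import torch
from PIL import Image
from sklearn.svm import SVC
from torchvision import models, transforms

# Pretrained ResNet-18 as a generic feature extractor (assumed backbone; the
# paper's CNN architecture is not specified in this record).
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()                  # drop the ImageNet classification head
cnn.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_features(image_path: str) -> np.ndarray:
    """CNN features followed by a DWT; keeping only the approximation band
    roughly halves the feature dimensionality."""
    img = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        feats = cnn(preprocess(img).unsqueeze(0)).squeeze(0).numpy()
    approx, _detail = pywt.dwt(feats, "haar")
    return approx

# Classification stage: an SVM trained on features from labelled
# authentic (0) and spliced (1) images, e.g. from the CASIA datasets.
# X = np.stack([extract_features(p) for p in train_paths])
# clf = SVC(kernel="linear").fit(X, train_labels)
# verdict = clf.predict(extract_features("query.jpg").reshape(1, -1))

For the DCT/PCA variant mentioned in the summary, the pywt.dwt step would be replaced by a DCT (e.g. scipy.fft.dct) followed by a PCA fitted on the training features.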
ISSN: 2193-567X, 1319-8025, 2191-4281
DOI: 10.1007/s13369-020-04401-0