Robust JPEG steganography based on DCT and SVD in nonsubsampled shearlet transform domain


Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 81, no. 25, pp. 36453-36472
Main Authors: Song, Xiaofeng; Yang, Chunfang; Han, Kun; Ding, Shichang
Format: Journal Article
Language: English
Published: New York: Springer US, 01.10.2022 (Springer Nature B.V.)

Summary: Social media platforms such as WeChat provide rich cover images for covert communication via steganography. However, to save bandwidth and storage space and to make images load faster, these platforms often compress uploaded images, which renders image steganography algorithms designed for lossless network channels unusable. This paper proposes a robust JPEG steganography algorithm based on DCT and SVD in the nonsubsampled shearlet transform (NSST) domain; it resists image compression and correctly extracts the embedded secret message from the compressed stego image. First, a construction method for the robust embedding domain is proposed that combines the advantages of the nonsubsampled shearlet transform, DCT, and SVD. Then, based on the minimal-distortion principle, the framework of the proposed robust JPEG steganography algorithm is given and its key steps are described in detail. Experimental results show that the proposed algorithm achieves competitive robustness and anti-detection capability compared with state-of-the-art robust steganography algorithms. Moreover, it can extract the secret message correctly even when the stego image is compressed by WeChat.
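The core idea sketched in the abstract, embedding in singular values of DCT coefficients so that the message survives lossy recompression, can be illustrated with a minimal quantization-index-modulation (QIM) example. This is a simplified sketch, not the paper's method: the NSST decomposition and the minimal-distortion coding framework are omitted, and the function names and quantization step `q` are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, q=24.0):
    """Embed one bit into an 8x8 spatial block by quantizing the
    largest singular value of its 2-D DCT coefficient matrix
    (QIM-style; the paper's NSST stage is omitted here)."""
    C = dctn(block, norm='ortho')
    U, s, Vt = np.linalg.svd(C)
    # Snap the largest singular value to the lower or upper quarter
    # of its quantization cell, depending on the bit. The +/- q/4
    # margin around the cell midpoint gives robustness to the small
    # coefficient perturbations introduced by recompression.
    k = np.floor(s[0] / q) * q
    s[0] = k + (0.75 * q if bit else 0.25 * q)
    C2 = U @ np.diag(s) @ Vt
    return idctn(C2, norm='ortho')

def extract_bit(block, q=24.0):
    """Recover the bit from the position of the largest singular
    value within its quantization cell."""
    C = dctn(block, norm='ortho')
    s = np.linalg.svd(C, compute_uv=False)
    return int((s[0] % q) > q / 2)
```

Singular values are used because they are comparatively stable under JPEG requantization, which is what makes a scheme like this "robust" in the paper's sense; the real algorithm additionally selects the embedding domain via NSST and assigns per-element costs under the minimal-distortion principle.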
Bibliography: ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-022-13525-4