RNN-Stega: Linguistic Steganography Based on Recurrent Neural Networks

Bibliographic Details
Published in: IEEE Transactions on Information Forensics and Security, Vol. 14, no. 5, pp. 1280-1295
Main Authors: Yang, Zhong-Liang; Guo, Xiao-Qing; Chen, Zi-Ming; Huang, Yong-Feng; Zhang, Yu-Jin
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2019
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2018.2871746

More Information
Summary: Linguistic steganography based on automatic text-carrier generation is a current topic with great promise and challenges. Limited by automatic text generation technology or the corresponding text coding methods, the quality of the steganographic text generated by previous methods is inferior, which makes its imperceptibility unsatisfactory. In this paper, we propose a linguistic steganography method based on recurrent neural networks, which can automatically generate high-quality text covers from a secret bitstream that needs to be hidden. We trained our model on a large number of artificially generated samples and obtained a good estimate of the statistical language model. For the text generation process, we propose fixed-length coding and variable-length coding to encode words based on their conditional probability distribution. We designed several experiments to test the proposed model from the perspectives of information hiding efficiency, information imperceptibility, and information hiding capacity. The experimental results show that the proposed model outperforms all previous related methods and achieves state-of-the-art performance.
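The fixed-length coding idea mentioned in the summary can be sketched as follows: at each generation step, the 2^b most probable next words (under the language model's conditional distribution) form a candidate pool, and b bits of the secret stream select which word to emit. This is a minimal illustrative sketch, not the paper's implementation: the toy per-step probability tables below stand in for a trained RNN language model, and all names and values are hypothetical.

```python
def embed_fixed_length(bits, step_distributions, b=2):
    """Encode a bitstream into a word sequence, b bits per word."""
    words = []
    pos = 0
    for dist in step_distributions:
        # Candidate pool: the 2**b most probable words at this step.
        pool = sorted(dist, key=dist.get, reverse=True)[:2 ** b]
        chunk = bits[pos:pos + b].ljust(b, "0")  # pad a short tail with zeros
        words.append(pool[int(chunk, 2)])        # the b bits index the pool
        pos += b
        if pos >= len(bits):
            break
    return words

def extract_fixed_length(words, step_distributions, b=2):
    """Recover the bitstream by rebuilding each step's candidate pool."""
    bits = []
    for word, dist in zip(words, step_distributions):
        pool = sorted(dist, key=dist.get, reverse=True)[:2 ** b]
        bits.append(format(pool.index(word), f"0{b}b"))
    return "".join(bits)

# Toy conditional distributions; in the paper these would come from the RNN.
dists = [
    {"the": 0.4, "a": 0.3, "one": 0.2, "this": 0.1},
    {"cat": 0.5, "dog": 0.25, "bird": 0.15, "fox": 0.1},
    {"sat": 0.6, "ran": 0.2, "slept": 0.15, "hid": 0.05},
]

cover = embed_fixed_length("011011", dists, b=2)       # → ['a', 'bird', 'hid']
recovered = extract_fixed_length(cover, dists, b=2)    # → '011011'
```

The receiver only needs the same language model (here, the same toy tables) to rebuild each step's candidate pool and invert the mapping; the variable-length coding variant described in the paper replaces the fixed 2^b pool with a prefix code over the candidate words.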