ReDMark: Framework for residual diffusion watermarking based on deep networks
Published in | Expert Systems with Applications, Vol. 146, p. 113157 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | New York: Elsevier Ltd / Elsevier BV, 15.05.2020 |
Subjects | |
Summary: |
•Learning new embedding patterns.
•Customizing solutions for the suggested transform domains and attacks.
•Enhancing security and robustness by diffusing the watermark over a wide area of the image.

Due to the rapid growth of machine learning tools, and specifically deep networks, in various computer vision and image processing areas, applications of Convolutional Neural Networks to watermarking have recently emerged. In this paper, we propose a deep end-to-end diffusion watermarking framework (ReDMark) that can learn a new watermarking algorithm in any desired transform space. The framework is composed of two Fully Convolutional Neural Networks with residual structure that handle the embedding and extraction operations in real time. The whole deep network is trained end-to-end to perform blind, secure watermarking. The proposed framework simulates various attacks as a differentiable network layer to facilitate end-to-end training. The watermark data is diffused over a relatively wide area of the image to enhance the security and robustness of the algorithm. Comparative results against recent state-of-the-art studies highlight the superiority of the proposed framework in terms of imperceptibility, robustness, and speed. The source code of the proposed framework is publicly available on GitHub: https://github.com/MahdiShAhmadi/ReDMark/tree/master/.
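The abstract describes the pipeline only at a high level: a residual embedding network, a differentiable attack layer, and an extraction network trained end-to-end. The sketch below illustrates that idea in PyTorch under stated assumptions; the layer sizes, the Gaussian-noise attack, the loss weighting, and the names `ResidualEmbedder`, `Extractor`, and `attack_layer` are illustrative choices, not the paper's exact architecture (which also operates in a chosen transform domain).

```python
import torch
import torch.nn as nn

class ResidualEmbedder(nn.Module):
    """Predicts a residual that is added to the cover image, so the
    watermark is diffused over a wide area rather than a single pixel/block."""
    def __init__(self, msg_channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + msg_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, cover, watermark, strength=1.0):
        x = torch.cat([cover, watermark], dim=1)
        return cover + strength * self.net(x)   # residual (additive) embedding

class Extractor(nn.Module):
    """Recovers watermark-bit logits from the (possibly attacked) marked image."""
    def __init__(self, msg_channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, msg_channels, 3, padding=1),
        )

    def forward(self, marked):
        return self.net(marked)

def attack_layer(marked, sigma=0.05):
    """Differentiable attack simulation (additive Gaussian noise here);
    gradients flow through it, enabling end-to-end robustness training."""
    return marked + sigma * torch.randn_like(marked)

# End-to-end training step (toy data, single iteration)
embedder, extractor = ResidualEmbedder(), Extractor()
opt = torch.optim.Adam(list(embedder.parameters()) + list(extractor.parameters()), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

cover = torch.rand(8, 1, 32, 32)                     # toy grayscale covers
bits = torch.randint(0, 2, (8, 1, 32, 32)).float()   # toy watermark pattern

marked = embedder(cover, bits)
logits = extractor(attack_layer(marked))

# trade off extraction accuracy against imperceptibility (weight is illustrative)
loss = bce(logits, bits) + 0.7 * nn.functional.mse_loss(marked, cover)
opt.zero_grad(); loss.backward(); opt.step()
```

Because the attack is expressed as a network layer, the embedder learns embedding patterns that survive it, which is the end-to-end training property the abstract emphasizes.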
ISSN: | 0957-4174; 1873-6793 |
DOI: | 10.1016/j.eswa.2019.113157 |