Boosting Deepfake Feature Extractors Using Unsupervised Domain Adaptation

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 31, pp. 2010–2014
Main Authors: Li, Jicheng; Hu, Yongjian; Liu, Beibei; Gong, Zhaolong; Kang, Xiangui
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
Summary: To make deepfake detectors generalize to different target domains, one effective way is to make the source domain used for training similar to the target domain used for detection. This letter tackles the problem from the perspective of domain adaptation and achieves both image-level and feature-level domain alignment. The proposed unsupervised domain adapter accomplishes image-level domain alignment through a combination of cross-domain style feature mixing and a diffusion model, and feature-level domain alignment through prototypical-consistency-guided supervision and adversarial learning. Style is transferred from source to target to generate a target-data proxy in the form of stylized images, and the features of the stylized images are further aligned with the target prototypical features. We apply the domain adapter as a feature booster to four current deepfake detectors. Experimental results show that all detectors achieve a significant increase in AUC on cross-dataset testing. We further propose a deepfake detector based on the Xception backbone with our booster; compared with five state-of-the-art detectors, it performs best in all experiments.
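As a rough illustration of the two alignment mechanisms named in the abstract, the sketch below shows a minimal AdaIN/MixStyle-style cross-domain style feature mixing step and a prototypical consistency loss in PyTorch. The function names, the mixing coefficient alpha, and the hard nearest-prototype assignment are assumptions for illustration only; the authors' actual method additionally involves a diffusion model and adversarial learning, which are omitted here.

import torch
import torch.nn.functional as F

def mix_style(src_feat, tgt_feat, alpha=0.5):
    # Hypothetical image-level alignment step: re-normalize source feature
    # maps (B, C, H, W) with channel-wise statistics mixed from the source
    # and target batches, in the spirit of AdaIN/MixStyle.
    eps = 1e-6
    mu_s = src_feat.mean(dim=(2, 3), keepdim=True)
    sig_s = src_feat.std(dim=(2, 3), keepdim=True) + eps
    mu_t = tgt_feat.mean(dim=(2, 3), keepdim=True)
    sig_t = tgt_feat.std(dim=(2, 3), keepdim=True) + eps
    mu_mix = alpha * mu_s + (1 - alpha) * mu_t       # convex style mixing
    sig_mix = alpha * sig_s + (1 - alpha) * sig_t
    return sig_mix * (src_feat - mu_s) / sig_s + mu_mix

def prototype_consistency_loss(feats, prototypes):
    # Hypothetical feature-level alignment step: pull stylized-image
    # embeddings (B, D) toward their nearest target-domain prototype (K, D)
    # by maximizing cosine similarity.
    feats = F.normalize(feats, dim=1)
    prototypes = F.normalize(prototypes, dim=1)
    sims = feats @ prototypes.t()                    # (B, K) similarities
    nearest = sims.argmax(dim=1, keepdim=True)       # hard assignment
    return (1 - sims.gather(1, nearest)).mean()

In a training loop of this hypothetical form, mix_style would stylize source features toward the target domain before stylized images are generated, and prototype_consistency_loss would be added to the detector's objective alongside an adversarial domain loss.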
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2024.3433546