Advocating Pixel-Level Authentication of Camera-Captured Images

Bibliographic Details
Published in: IEEE Access, Vol. 12, pp. 45839-45846
Main Authors: Punnappurath, Abhijith; Zhao, Luxi; Abdelhamed, Abdelrahman; Brown, Michael S.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
Summary: The authenticity of digital images posted online and shared on social media is often questioned due to the ability of photo-editing software to alter image content and generative AI methods that can produce visually compelling deepfakes. Only images directly produced by cameras are deemed unaltered and beyond suspicion, as they have not undergone any modifications. However, there is a recent trend among camera manufacturers to integrate AI-based modules into the dedicated onboard hardware, specifically the image signal processor (ISP), responsible for processing the captured sensor image into the final saved image for users. Many of these AI modules utilize perceptual or generative losses during training, which can "hallucinate" image content. While this hallucinated content often manifests as small details and textures, there are instances where these regions unintentionally impact the interpretation of the entire image. This paper aims to bring attention to this issue and advocate for in-camera strategies to validate the authenticity of camera-captured images at a pixel level. We propose the creation of an "authenticity" mask that could be stored as additional metadata with each image. This information can be extracted and overlaid on the image to easily identify the hallucinated regions. Considering the widespread implications of image authenticity (e.g., in courtroom evidence, news broadcasts, and other media forms), we anticipate that authentication metadata will become a standard practice for any ISP utilizing AI.
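The sketch below illustrates the overlay step described in the summary and is not the paper's implementation: it assumes a binary per-pixel authenticity mask has already been recovered from the image's metadata (the paper does not prescribe a storage format), and it simply tints the flagged pixels red so hallucinated regions are easy to inspect. The function name, mask encoding, and file names are hypothetical.

```python
# Hedged sketch (assumptions, not the paper's method): given an image and a
# binary per-pixel "authenticity" mask extracted from its metadata, tint the
# pixels flagged as AI-hallucinated so they stand out during inspection.
import numpy as np
from PIL import Image

def overlay_authenticity_mask(image_path: str, mask: np.ndarray, alpha: float = 0.5) -> Image.Image:
    """Blend a red highlight over pixels where mask == 1 (hallucinated content)."""
    img = np.array(Image.open(image_path).convert("RGB"), dtype=np.float32)
    flagged = mask.astype(bool)  # True where the ISP's AI module altered content
    red = np.array([255.0, 0.0, 0.0], dtype=np.float32)
    img[flagged] = (1.0 - alpha) * img[flagged] + alpha * red
    return Image.fromarray(img.astype(np.uint8))

if __name__ == "__main__":
    # Hypothetical example: pretend the top-left 64x64 block was hallucinated.
    src = Image.open("photo.jpg")
    mask = np.zeros((src.height, src.width), dtype=np.uint8)
    mask[:64, :64] = 1
    overlay_authenticity_mask("photo.jpg", mask).save("photo_overlay.png")
```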
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3381521