Transfer Learning Based Fine-Tuned Novel Approach for Detecting Facial Retouching


Bibliographic Details
Published in: Iraqi Journal for Electrical and Electronic Engineering, Vol. 20, No. 1, pp. 84-94
Main Authors: Sheth, Kinjal; Vora, Vishal
Format: Journal Article
Language: English
Published: 15.06.2024

Summary: Facial retouching, also referred to as digital retouching, is the process of modifying or enhancing facial characteristics in digital images or photographs. While it can be a valuable technique for fixing flaws or achieving a desired visual appeal, it also gives rise to ethical considerations. This study categorizes genuine and retouched facial images from the standard ND-IIITD retouched faces dataset using a transfer learning methodology. The impact of three primary optimization algorithms—Adam, RMSprop, and Adadelta—used in conjunction with a fine-tuned ResNet50 model is examined to assess potential gains in classification effectiveness. Our proposed transfer learning ResNet50 model demonstrates superior performance compared to other existing approaches, particularly when the RMSprop and Adam optimizers are employed in the fine-tuning process. By training the transfer learning ResNet50 model on the ND-IIITD retouched faces dataset with "ImageNet" weights, we achieve a validation accuracy of 98.76%, a training accuracy of 98.32%, and an overall accuracy of 98.52% for classifying real and retouched faces in just 20 epochs. Comparative analysis indicates that the choice of optimizer during fine-tuning of the transfer learning ResNet50 model can further enhance classification accuracy.
ISSN: 1814-5892
eISSN: 2078-6069
DOI: 10.37917/ijeee.20.1.9
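The fine-tuning setup described in the summary—ResNet50 initialized with ImageNet weights, a new binary head for real vs. retouched faces, and a choice of Adam, RMSprop, or Adadelta—can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' published code: layer choices, learning rates, and the helper name `build_retouch_classifier` are assumptions.

```python
# Hypothetical sketch of a fine-tuned ResNet50 transfer-learning classifier
# for real vs. retouched face images. Hyperparameters are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers


def build_retouch_classifier(optimizer_name="rmsprop", weights="imagenet"):
    # Backbone pretrained on ImageNet, without its original 1000-class head.
    base = tf.keras.applications.ResNet50(
        weights=weights, include_top=False, input_shape=(224, 224, 3))
    base.trainable = True  # fine-tune the whole backbone

    # New binary classification head: real (0) vs. retouched (1).
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(1, activation="sigmoid"),
    ])

    # The study compares Adam, RMSprop, and Adadelta during fine-tuning.
    opt = {
        "adam": optimizers.Adam(learning_rate=1e-4),
        "rmsprop": optimizers.RMSprop(learning_rate=1e-4),
        "adadelta": optimizers.Adadelta(learning_rate=1.0),
    }[optimizer_name]

    model.compile(optimizer=opt,
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

Training would then call `model.fit(train_ds, validation_data=val_ds, epochs=20)` on a dataset pipeline of 224x224 face crops, mirroring the 20-epoch schedule reported in the summary.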