Utilization of the YOLOv8 Methodology for Individual Identification Through Facial Recognition

Bibliographic Details
Published in: Electrical and Computer Engineering (ICECE), International Conference on, pp. 651-656
Main Authors: Hasan, Walid; Mahira, Sumaya; Hosen, Golam Shakib; Harun-Ar-Rashid, Md; Mollah, Torikuzzaman
Format: Conference Proceeding
Language: English
Published: IEEE, 18.12.2024

Summary: Traditional methods for person detection have primarily focused on particular camera angles and frontal views. In this paper, we address various challenges that arise in complex scenarios, such as uneven lighting conditions, color and grayscale variations, low-resolution images with noise and blur, large-scale face databases, and variability in pose and viewpoint under real-time processing constraints. We employ the YOLOv8 model (a recent update of YOLO, the "You Only Look Once" real-time deep learning object detection algorithm that classifies and localizes objects in a single pass through a neural network), which incorporates key innovations such as spatial attention, feature fusion with the C2f module, bottleneck blocks, and Spatial Pyramid Pooling Fast (SPPF), to implement person identification and to deploy the model in real-world scenarios where real-time processing is essential. We trained YOLOv8 on an extensive customized dataset of our own, with Roboflow proving instrumental for data preprocessing, and achieved a mean average precision (mAP) of 80.4%. This result demonstrates the model's robustness and potential for deployment in challenging environments requiring real-time processing.
ISSN: 2771-7917
DOI: 10.1109/ICECE64886.2024.11024852
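
The record itself contains no code, but as a hedged illustration of the pipeline the summary describes, the sketch below shows how a comparable YOLOv8 training and evaluation run could look with the Ultralytics Python package. The dataset file faces.yaml, the yolov8n.pt checkpoint, and all hyperparameters are illustrative assumptions, not the authors' actual configuration.

    # Minimal sketch (not the authors' code): fine-tuning and evaluating a YOLOv8
    # detector with the Ultralytics package on a Roboflow-exported dataset.
    # "faces.yaml", the checkpoint choice, and the hyperparameters are assumptions.
    from ultralytics import YOLO

    # Start from a pretrained YOLOv8 checkpoint (nano variant chosen as an example).
    model = YOLO("yolov8n.pt")

    # Fine-tune on a custom face/person dataset (YOLO-format export from Roboflow).
    model.train(data="faces.yaml", epochs=100, imgsz=640)

    # Validate and report mean average precision; the paper quotes a mAP of 80.4%.
    metrics = model.val()
    print(f"mAP@0.5      : {metrics.box.map50:.3f}")
    print(f"mAP@0.5:0.95 : {metrics.box.map:.3f}")

    # Run inference on a single image (a video stream or webcam index also works).
    results = model.predict(source="example.jpg", conf=0.25)

In such a setup, Roboflow would supply the YOLO-format annotations referenced by faces.yaml; whether the resulting mAP matches the 80.4% reported in the summary would depend on the actual dataset and training schedule used by the authors.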