Comparative analysis of deep learning models for detecting face mask
Published in | Procedia Computer Science, Vol. 216, pp. 48–56 |
Main Authors | |
Format | Journal Article |
Language | English |
Published | Elsevier B.V., Netherlands, 01.01.2023 |
Summary: | The spread of Coronavirus Disease 2019 (COVID-19) in Indonesia remains relatively high and has not shown a significant decrease. One of the main reasons is the lack of supervision of health protocols, such as wearing masks in daily activities. Recently, state-of-the-art algorithms have been introduced to automate face mask detection; specifically, researchers have developed various architectures for mask detection based on computer vision methods. This paper evaluates well-known architectures, namely ResNet50, VGG11, InceptionV3, EfficientNetB4, and YOLO (You Only Look Once), to recommend the best approach for this task. Using the MaskedFace-Net dataset, the experimental results showed that the EfficientNetB4 architecture achieved the best accuracy at 95.77%, compared to 93.40% for YOLOv4, 87.30% for InceptionV3, 86.35% for YOLOv3, 84.41% for ResNet50, 84.38% for VGG11, and 78.75% for YOLOv2. It should be noted that, for YOLO in particular, the model was trained on a collection of MaskedFace-Net images that had been pre-processed and labelled for the task. Thanks to transfer learning, the model initially trained faster with pre-trained weights from the COCO dataset, yielding a robust set of features suited to face mask detection and classification. |
Bibliography: | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
ISSN: | 1877-0509 |
DOI: | 10.1016/j.procs.2022.12.110 |
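The accuracy comparison reported in the summary can be tabulated and ranked with a short script. The model names and accuracy figures below are taken directly from the abstract; the ranking logic itself is only an illustrative sketch, not part of the paper's method:

```python
# Reported test accuracies (%) on MaskedFace-Net, as listed in the abstract.
accuracies = {
    "EfficientNetB4": 95.77,
    "YOLOv4": 93.40,
    "InceptionV3": 87.30,
    "YOLOv3": 86.35,
    "ResNet50": 84.41,
    "VGG11": 84.38,
    "YOLOv2": 78.75,
}

# Rank architectures from best to worst reported accuracy.
ranking = sorted(accuracies, key=accuracies.get, reverse=True)
best = ranking[0]
# Margin between the best model and the runner-up.
margin = round(accuracies[best] - accuracies[ranking[1]], 2)
print(best, margin)  # EfficientNetB4 2.37
```

This makes the headline result of the paper easy to verify at a glance: EfficientNetB4 leads YOLOv4 by 2.37 percentage points, and VGG11 trails ResNet50 by only 0.03.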