Face recognition for occluded face with mask region convolutional neural network and fully convolutional network: a literature review


Bibliographic Details
Published in: International Journal of Electrical and Computer Engineering (Malacca, Malacca), Vol. 13, No. 5, p. 5662
Main Authors: Budiarsa, Rahmat; Wardoyo, Retantyo; Musdholifah, Aina
Format: Journal Article
Language: English
Published: 01.10.2023

Summary: Face recognition technology is used in many applications, such as authentication and identification. The object of interest here is a face image that lacks complete facial information (an occluded face), which can result from acquisition at a different viewpoint or from photographing the face at a different angle. This object was chosen because occlusion degrades detection and identification performance on the face image as a whole. Deep learning methods can be used to solve face recognition problems. Previous research focused more on face detection, and on recognition based on resolution. The mask region convolutional neural network (Mask R-CNN) method still has a deficiency in its segmentation stage, which reduces the accuracy of face identification for objects with incomplete facial information. The segmentation network used in Mask R-CNN is a fully convolutional network (FCN). In this research, many FCN parameters will be explored and modified using the CNN backbone pooling layer, Mask R-CNN will be modified for face identification, and, in addition, the bounding box regressor will be modified. It is expected that the results of these modifications can provide the best recommendations based on accuracy.
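As context for the summary above: Mask R-CNN's FCN mask head predicts a coarse per-region mask (e.g. 14×14 per RoI) and upsamples it toward input resolution. As a minimal, dependency-free sketch of that upsampling idea only (the `upsample2x` helper is hypothetical, not code from the reviewed paper, and a learned transposed convolution would replace it in a real FCN):

```python
def upsample2x(mask):
    """Nearest-neighbour 2x upsampling of a 2-D mask (list of lists).

    Toy stand-in for the learned upsampling step of an FCN mask head:
    each coarse mask value is repeated into a 2x2 block, doubling
    both spatial dimensions.
    """
    out = []
    for row in mask:
        wide = [v for v in row for _ in range(2)]  # repeat each column
        out.append(wide)
        out.append(list(wide))  # repeat each row
    return out


# A coarse 2x2 per-RoI mask becomes a 4x4 mask after one 2x step.
coarse = [[0.1, 0.9],
          [0.8, 0.2]]
fine = upsample2x(coarse)
```

Stacking such 2× steps (14×14 → 28×28 → …) is how the FCN head recovers per-pixel mask resolution from a pooled feature map.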
ISSN: 2088-8708, 2722-2578
DOI: 10.11591/ijece.v13i5.pp5662-5673