Gait Biometric-based Human Recognition System Using Deep Convolutional Neural Network in Surveillance System


Bibliographic Details
Published in: 2020 Asia Conference on Computers and Communications (ACCC), pp. 47-51
Main Authors: Aung, Hsu Mon Lei; Pluempitiwiriyawej, Charnchai
Format: Conference Proceeding
Language: English
Published: IEEE, 18.09.2020

Summary: At present, automatic person identification is one of the most useful tasks in many fields, such as biometric authentication, security control, and video surveillance systems. Physical biometric data, such as the face, iris, and fingerprints, are the main characteristics used to recognize people. However, this information can be easily stolen and is difficult to capture when the subject is far from the camera, which makes reliable person identification problematic. Furthermore, face-based person identification is impractical in some circumstances, such as a considerable distance from the camera, poor lighting conditions, and large occlusions. Other biometric features, such as gait, skeleton data, full-body gestures, and poses, are difficult to imitate and can be captured at long distances. In this paper, a gait biometric-based person identification method using a deep convolutional neural network (CNN) is proposed to learn the critical discriminative gait features for human identification. The technique uses Gait Energy Images (GEI) of humans for identification. The proposed method was evaluated on the CASIA-B gait dataset. The empirical results show its effectiveness compared with state-of-the-art machine learning models for recognition under clothing changes and viewing-angle variations.
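
To illustrate the pipeline described in the summary, the following is a minimal sketch of gait recognition from Gait Energy Images with a CNN, written in Python with PyTorch. The silhouette source, the 64x64 image size, and the network layout are illustrative assumptions and not the authors' architecture; only the GEI averaging step and the use of a CNN classifier follow the abstract. CASIA-B contains 124 subjects, which sets the output size here.

# Minimal sketch of the GEI + CNN pipeline from the abstract.
# The silhouette loading, 64x64 input size, and layer layout are
# illustrative assumptions, not the authors' exact architecture.
import numpy as np
import torch
import torch.nn as nn

def gait_energy_image(silhouettes: np.ndarray) -> np.ndarray:
    """Average a stack of aligned binary silhouettes (T, H, W) into one GEI."""
    return silhouettes.astype(np.float32).mean(axis=0)

class GaitCNN(nn.Module):
    """Small CNN mapping a single-channel GEI to per-subject logits."""
    def __init__(self, num_subjects: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        # 64x64 input is pooled twice, giving a 32 x 16 x 16 feature map.
        self.classifier = nn.Linear(32 * 16 * 16, num_subjects)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    # Stand-in gait cycle: 30 random binary 64x64 silhouette frames.
    frames = (np.random.rand(30, 64, 64) > 0.5).astype(np.uint8)
    gei = gait_energy_image(frames)                          # (64, 64), values in [0, 1]
    batch = torch.from_numpy(gei).unsqueeze(0).unsqueeze(0)  # (1, 1, 64, 64)
    logits = GaitCNN(num_subjects=124)(batch)                # CASIA-B has 124 subjects
    print(logits.shape)                                      # torch.Size([1, 124])

In practice, the silhouette frames would come from background subtraction over one gait cycle, and the classifier would be trained with cross-entropy over the gallery subjects; both steps are omitted in this sketch.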
DOI: 10.1109/ACCC51160.2020.9347899