Frontal gait recognition from occluded scenes
Published in: Pattern Recognition Letters, Vol. 63, pp. 9–15
Main Authors: , ,
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.10.2015
Summary:

- First ever work on frontal gait recognition in the presence of occlusion.
- Fusion of front- and back-view features extracted from Kinect is used for recognition.
- Encouraging results are obtained even for high degrees of occlusion.

In this paper, we propose a method that uses Kinect depth data to address the problem of occlusion in frontal gait recognition. We consider situations where depth cameras are mounted above the entry and exit points of a zone under surveillance, capturing the back and front views, respectively, of each subject passing through the zone. A feature set corresponding to the back view is derived from the depth information along the contour of the silhouette, while the periodic variation of the lower-body skeleton structure, as estimated by Kinect, is extracted from the front view. These feature sets preserve gait dynamics at high resolution and can be extracted efficiently. In congested places such as airports, railway stations, and shopping malls, multiple persons move through the surveillance zone one after another, thereby occluding the target. The proposed recognition procedure compares the unoccluded frames of a cluttered test sequence with the matching frames of a training sequence; dynamic-programming-based local sequence alignment is used to determine this frame correspondence. The method is computationally efficient and shows encouraging results under different levels of occlusion.
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2015.06.004
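The dynamic-programming-based local sequence alignment mentioned in the summary can be sketched as a Smith–Waterman-style alignment over per-frame gait feature vectors. This is an illustrative reconstruction, not the authors' implementation: the cosine-similarity scoring, the match threshold, and the gap penalty are all assumptions made here for the sketch.

```python
import numpy as np

def local_align_frames(test_feats, train_feats, sim_threshold=0.9, gap_penalty=1.0):
    """Smith-Waterman-style local alignment between two sequences of
    per-frame gait feature vectors (arrays of shape (num_frames, dim)).

    Occluded test frames tend not to match any training frame, so the
    best local alignment picks out a contiguous run of unoccluded frames.
    Scoring is a hypothetical choice, not taken from the paper.
    """
    m, n = len(test_feats), len(train_feats)
    H = np.zeros((m + 1, n + 1))
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            a, b = test_feats[i - 1], train_feats[j - 1]
            # cosine similarity as the frame-match score (an assumption)
            sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
            match = 2.0 if sim >= sim_threshold else -1.0
            # local alignment: scores never drop below zero, so the
            # alignment can restart after a run of occluded frames
            H[i, j] = max(0.0,
                          H[i - 1, j - 1] + match,
                          H[i - 1, j] - gap_penalty,
                          H[i, j - 1] - gap_penalty)
    # best local alignment score: higher means a longer run of test
    # frames matched contiguous frames of the training sequence
    return H.max()
```

With this scoring, a test sequence aligned against itself scores 2.0 per frame, while a sequence of unrelated frames scores near zero, which is how the comparison of unoccluded test frames against training frames could be ranked across gallery subjects.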