Information fusion from multiple cameras for gait-based re-identification and recognition

Bibliographic Details
Published in: IET Image Processing, Vol. 9, No. 11, pp. 969-976
Main Authors: Chattopadhyay, Pratik; Sural, Shamik; Mukherjee, Jayanta
Format: Journal Article
Language: English
Published: The Institution of Engineering and Technology, 01.11.2015
Summary: In this study, the authors present a fully automated frontal (i.e. employing front and back views only) gait recognition approach using the depth information captured by multiple Kinect RGB-D cameras. The limited depth-sensing range restricts each of these Kinects to recording only a part of a complete gait cycle of a walking subject. Hence, information from more than one Kinect is fused to determine which features of a gait cycle can be conveniently extracted from the sequences captured independently by these cameras. To achieve this, it is imperative that the same subject be re-identified as he moves from the field of view of one camera to another. The authors use a set of soft-biometric features computed from the skeleton stream provided by the Kinect software development kit (SDK) to perform automatic re-identification. To enable such information fusion, and also to handle components that remain missing even after re-identification, features are extracted at the granularity of small fractions of a gait cycle. Experiments carried out on a data set of gait videos captured by Kinects from the back and front views, respectively, show promising results.
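The summary describes re-identifying a walker across cameras by matching soft-biometric features computed from the Kinect SDK skeleton stream. The Python sketch below illustrates that general idea only; the joint names, the particular limb-length features, the frame averaging, and the distance threshold are assumptions made for illustration and are not taken from the paper.

import numpy as np

def _dist(a, b):
    # Euclidean distance between two points/vectors.
    return float(np.linalg.norm(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)))

def soft_biometric_vector(skeleton):
    """Illustrative soft-biometric features from one skeleton frame.

    `skeleton` is assumed to map joint names (e.g. "head", "foot_left",
    "shoulder_left", "shoulder_right", "wrist_left", "hip_left",
    "ankle_left", "shoulder_center", "hip_center") to 3-D positions in
    metres, in the spirit of the Kinect SDK skeleton stream."""
    return np.array([
        _dist(skeleton["head"], skeleton["foot_left"]),                # approximate height
        _dist(skeleton["shoulder_left"], skeleton["shoulder_right"]),  # shoulder width
        _dist(skeleton["shoulder_left"], skeleton["wrist_left"]),      # arm length
        _dist(skeleton["hip_left"], skeleton["ankle_left"]),           # leg length
        _dist(skeleton["shoulder_center"], skeleton["hip_center"]),    # torso length
    ])

def reidentify(query_frames, gallery, threshold=0.15):
    """Match a walker seen by a second Kinect against subjects enrolled
    from the first Kinect's view.

    query_frames -- list of skeleton dicts from the new camera
    gallery      -- {subject_id: enrolled feature vector}
    Returns the closest subject id, or None if no gallery entry lies within
    the (illustrative) distance threshold."""
    # Average over frames to suppress per-frame skeleton noise.
    query = np.mean([soft_biometric_vector(s) for s in query_frames], axis=0)
    best_id, best_dist = None, float("inf")
    for subject_id, feature in gallery.items():
        d = _dist(query, feature)
        if d < best_dist:
            best_id, best_dist = subject_id, d
    return best_id if best_dist < threshold else None

In such a scheme, the enrolled gallery vectors would be averaged over the frames recorded by the first camera in the same way before the subject leaves its field of view.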
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2014.0773