Pedestrian Detection and Tracking Using a Mixture of View-Based Shape-Texture Models

Bibliographic Details
Published in: IEEE Transactions on Intelligent Transportation Systems, Vol. 9, No. 2, pp. 333–343
Main Authors: Munder, S., Schnörr, C., Gavrila, D.M.
Format: Journal Article
Language: English
Published: Piscataway, NJ: IEEE, 01.06.2008

More Information
Summary: This paper presents a robust multicue approach to the integrated detection and tracking of pedestrians in a cluttered urban environment. A novel spatiotemporal object representation is proposed, which combines a generative shape model and a discriminative texture classifier, both of which are composed of a mixture of pose-specific submodels. Shape is represented by a set of linear subspace models, which is an extension of point distribution models, with shape transitions being modeled by a first-order Markov process. Texture, i.e., the shape-normalized intensity pattern, is represented by a manifold that is implicitly delimited by a set of pattern classifiers, whereas texture transition is modeled by a random walk. Direct 3-D measurements that are provided by a stereo system are further incorporated into the observation density function. We employ a Bayesian framework based on particle filtering to achieve integrated object detection and tracking. Large-scale experiments that involve pedestrian detection and tracking from a moving vehicle demonstrate the benefit of the proposed approach.
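The core machinery described in the abstract — a particle filter whose state evolves under a first-order Markov transition and is reweighted by an observation likelihood — can be illustrated with a minimal sketch. This is not the authors' implementation: the state here is a toy 2-D vector standing in for shape-subspace coefficients, and the AR(1) coefficient, noise levels, and Gaussian likelihood are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, a=0.9, noise=0.1):
    # First-order Markov (AR(1)) transition on the state vector,
    # standing in for the paper's shape-transition model.
    return a * particles + noise * rng.standard_normal(particles.shape)

def weight(particles, observation, sigma=0.5):
    # Toy Gaussian observation likelihood; the paper's observation
    # density instead fuses shape, texture, and stereo depth cues.
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / sigma**2)
    return w / w.sum()

def resample(particles, weights):
    # Systematic resampling: low-variance selection proportional to weight.
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx]

# Toy run: track a fixed true state from noisy observations.
true_state = np.array([1.0, -0.5])
particles = rng.standard_normal((200, 2))
for _ in range(30):
    particles = predict(particles)
    obs = true_state + 0.1 * rng.standard_normal(2)
    particles = resample(particles, weight(particles, obs))

estimate = particles.mean(axis=0)  # posterior mean after filtering
```

After a few dozen predict–weight–resample cycles the particle cloud concentrates near the true state; in the paper, the same loop runs over pose-specific shape and texture submodels rather than a toy Gaussian.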
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2008.922943