Multi-target tracking by on-line learned discriminative appearance models

Bibliographic Details
Published in: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 685-692
Main Authors: Cheng-Hao Kuo, Chang Huang, Ramakant Nevatia
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2010
Summary: We present an approach for online learning of discriminative appearance models for robust multi-target tracking in a crowded scene from a single camera. Although much progress has been made in developing methods for optimal data association, there has been comparatively less work on the appearance models, which are key elements for good performance. Many previous methods either use simple features such as color histograms, or focus on the discriminability between a target and the background, which does not resolve ambiguities between the different targets. We propose an algorithm for learning a discriminative appearance model for different targets. Training samples are collected online from tracklets within a sliding time window based on spatial-temporal constraints; this allows the models to adapt to target instances. Learning uses an AdaBoost algorithm that combines effective image descriptors and their corresponding similarity measurements. We term the learned models OLDAMs. Our evaluations indicate that OLDAMs have significantly higher discrimination between different targets than conventional holistic color histograms, and when integrated into a hierarchical association framework, they help improve the tracking accuracy, particularly reducing false alarms and identity switches. (An illustrative sketch of this training procedure appears after the record below.)
ISBN: 1424469848, 9781424469840
ISSN: 1063-6919
DOI: 10.1109/CVPR.2010.5540148
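
The summary above describes the training procedure only at a high level, so the following minimal Python sketch shows one plausible reading of it: positive (same-target) pairs are sampled from within individual tracklets, negative (different-target) pairs from tracklets that overlap in time, and a boosted classifier is fit over pairwise similarity features. The tracklet layout, the two similarity features, and the use of scikit-learn's AdaBoostClassifier are assumptions made for illustration, not the authors' actual descriptors or implementation.

# Hypothetical sketch of OLDAM-style training: the tracklet format, the
# similarity features, and the classifier choice are illustrative assumptions.
import numpy as np
from itertools import combinations
from sklearn.ensemble import AdaBoostClassifier

def color_histogram(patch, bins=8):
    """Per-channel histogram of an RGB patch (H x W x 3, uint8), L1-normalized."""
    hist = [np.histogram(patch[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
    hist = np.concatenate(hist).astype(float)
    return hist / (hist.sum() + 1e-9)

def similarity_features(patch_a, patch_b):
    """Pairwise similarity vector; histogram intersection and correlation
    stand in here for the paper's set of descriptors and similarity measures."""
    ha, hb = color_histogram(patch_a), color_histogram(patch_b)
    intersection = float(np.minimum(ha, hb).sum())
    correlation = float(np.corrcoef(ha, hb)[0, 1])
    return np.array([intersection, correlation])

def collect_training_pairs(tracklets):
    """Spatio-temporal sampling: patches within one tracklet give positive
    (same-target) pairs; patches from two tracklets that overlap in time give
    negative pairs, since co-occurring tracklets cannot be the same person."""
    X, y = [], []
    for trk in tracklets:
        for pa, pb in combinations(trk["patches"], 2):
            X.append(similarity_features(pa, pb))
            y.append(1)
    for ta, tb in combinations(tracklets, 2):
        if ta["frames"] & tb["frames"]:  # temporal overlap -> different targets
            for pa in ta["patches"]:
                for pb in tb["patches"]:
                    X.append(similarity_features(pa, pb))
                    y.append(0)
    return np.array(X), np.array(y)

def train_appearance_model(tracklets):
    """Fit a boosted classifier whose decision score acts as the learned
    appearance affinity between two detections."""
    X, y = collect_training_pairs(tracklets)
    model = AdaBoostClassifier(n_estimators=50)
    model.fit(X, y)
    return model

In the setting described by the summary, this training would be repeated online for each sliding time window, so the learned affinity adapts to the targets currently in the scene before it is used by the hierarchical association stage.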