Local Associated Features for Pedestrian Detection


Bibliographic Details
Published in: Computer Vision - ACCV 2014 Workshops, pp. 513-526
Main Authors: Shao, Song; Liu, Hong; Wang, Xiangdong; Qian, Yueliang
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing, 2015
Series: Lecture Notes in Computer Science

Summary: Local features are widely used to describe pedestrian appearance, yet most existing pedestrian detection methods do not make full use of context cues such as the associated relationships between different local locations. This paper proposes two novel kinds of local associated features, the gradient orientation associated feature (GOAF) and the local difference of ACF (ACF-LD), to exploit context information. In this work, pedestrian samples are enlarged to contain some background region in addition to the human body, and GOAF, ACF and ACF-LD are combined to describe each pedestrian sample. GOAF is constructed by encoding gradient orientation features from two different positions, lying at different distances and in different directions from each other, into a single value. For ACF-LD, the sample is divided into several sub-regions and the ACF difference matrices between these areas are computed to exploit the associated information between the pedestrian and the surrounding background. The proposed local associated features provide complementary information for detection tasks. Finally, these features are fused with ACF to form a candidate feature pool, and AdaBoost is used to select features and train a cascaded classifier of depth-two decision trees. Experimental results on two public datasets show that the proposed framework achieves promising results compared with state-of-the-art methods.
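The abstract describes the two feature constructions only at a high level; the snippet below is one plausible NumPy reading of them, a minimal sketch rather than the authors' implementation. The offset pairing in goaf_encode, the 6-bin orientation quantization, the 2x2 sub-region grid, and both function names are illustrative assumptions.

```python
# Illustrative sketch of the two associated features described in the
# abstract. The pairing offset, bin count, region grid, and function
# names are assumptions for illustration, not the authors' code.
import numpy as np

N_BINS = 6  # assumed number of quantized gradient-orientation bins


def goaf_encode(orient_bins, offset):
    """GOAF sketch: fuse the orientation bins at two positions into one value.

    orient_bins holds the quantized gradient orientation (0..N_BINS-1) at
    every pixel; offset = (dy, dx), assumed non-negative, selects the second
    position at a chosen distance and direction from the first.
    """
    dy, dx = offset
    h, w = orient_bins.shape
    first = orient_bins[: h - dy, : w - dx]   # orientation at position 1
    second = orient_bins[dy:, dx:]            # orientation at position 2
    return first * N_BINS + second            # one associated value per pair


def acf_ld(acf_channels, grid=(2, 2)):
    """ACF-LD sketch: per-channel difference matrix between sub-regions.

    The enlarged sample's ACF channels (shape: channels x height x width)
    are split into a grid of sub-regions; each region is summarized by its
    per-channel means, and pairwise differences relate the pedestrian
    region to the surrounding background.
    """
    c, h, w = acf_channels.shape
    gy, gx = grid
    region_means = np.stack([
        acf_channels[:, i * h // gy:(i + 1) * h // gy,
                        j * w // gx:(j + 1) * w // gx].mean(axis=(1, 2))
        for i in range(gy) for j in range(gx)
    ])  # shape: (n_regions, n_channels)
    # Difference matrix between every pair of sub-regions, per channel.
    return region_means[:, None, :] - region_means[None, :, :]
```

Per the abstract, such values would then be fused with plain ACF channels into a candidate feature pool and passed to AdaBoost over depth-two decision trees (for instance scikit-learn's AdaBoostClassifier(DecisionTreeClassifier(max_depth=2))); the cascade thresholding is a further step not shown here.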
ISBN: 9783319166278; 3319166271
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-16628-5_37