Robust head pose estimation using Dirichlet-tree distribution enhanced random forests
Published in | Neurocomputing (Amsterdam) Vol. 173; pp. 42–53 |
---|---|
Main Authors | , , , , , , |
Format | Journal Article |
Language | English |
Published | Elsevier B.V., 15.01.2016 |
Subjects | |
Summary: | Head pose estimation (HPE) is important in human–machine interfaces. However, varying illumination, occlusion, low image resolution and wide scenes make the estimation task difficult. Hence, a Dirichlet-tree distribution enhanced Random Forests approach (D-RF) is proposed in this paper to estimate head pose efficiently and robustly in unconstrained environments. First, facial patches are classified as positive or negative to eliminate the influence of noise and occlusion. Then, the D-RF estimates the head pose in a coarse-to-fine way using more powerful combined texture and geometric features of the classified positive patches. Furthermore, multiple probabilistic models are learned in the leaves of the D-RF, and a composite weighted voting method is introduced to improve the discrimination capability of the approach. Experiments have been conducted on three standard databases, including two public databases and our lab database, with head pose spanning from −90° to 90° in the vertical and horizontal directions under various conditions; the average accuracy rate reaches 76.2% with 25 classes. The proposed approach has also been evaluated on a low-resolution database collected from an overhead camera in a classroom, where the average accuracy rate reaches 80.5% with 15 classes. The encouraging results suggest strong potential for head pose and attention estimation in unconstrained environments. |
---|---|
ISSN: | 0925-2312 1872-8286 |
DOI: | 10.1016/j.neucom.2015.03.096 |
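The summary describes a two-stage, coarse-to-fine classification: one forest picks a coarse pose bin, then a bin-specific forest refines the estimate. A minimal sketch of that general idea on synthetic data follows; it is not the paper's actual D-RF (which uses Dirichlet-tree leaf distributions and composite weighted voting), and the feature vectors here merely stand in for the combined texture/geometric patch features.

```python
# Coarse-to-fine pose classification sketch with plain random forests.
# All data below is synthetic; this only illustrates the two-stage structure.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# 300 synthetic "patch feature" vectors, 16-D; fine pose labels 0..24
# (mirroring the paper's 25 pose classes over -90°..90°).
X = rng.normal(size=(300, 16))
y_fine = rng.integers(0, 25, size=300)
# Group the 25 fine poses into 5 coarse bins of 5 neighboring poses each.
y_coarse = y_fine // 5

# Stage 1: coarse forest over the 5 pose bins.
coarse_rf = RandomForestClassifier(n_estimators=20, random_state=0)
coarse_rf.fit(X, y_coarse)

# Stage 2: one refining forest per coarse bin, trained only on that bin's samples.
fine_rfs = {}
for b in range(5):
    mask = y_coarse == b
    fine_rfs[b] = RandomForestClassifier(n_estimators=20, random_state=0)
    fine_rfs[b].fit(X[mask], y_fine[mask])

def predict_pose(x):
    """Coarse-to-fine prediction for a single feature vector x."""
    b = int(coarse_rf.predict(x[None])[0])      # pick a coarse bin
    return int(fine_rfs[b].predict(x[None])[0]) # refine within that bin

pose = predict_pose(X[0])
```

The paper's composite weighted voting over leaf distributions could be approximated here by combining `predict_proba` outputs of the stages instead of taking hard decisions at each level.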