Calibrating Head Pose Estimation in Videos for Meeting Room Event Analysis

Bibliographic Details
Published in: 2006 International Conference on Image Processing, pp. 3193-3196
Main Authors: Tu, J., Huang, T., Xiong, Y., Rose, T., Quek, F.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2006
ISBN: 9781424404803, 1424404800
ISSN: 1522-4880
DOI: 10.1109/ICIP.2006.313066

Summary: In this paper, we study the calibration of head pose estimation in a stereo camera setting for meeting room video event analysis. Head pose indicates the direction of a subject's attention in video and is therefore valuable for video event analysis and indexing, especially in the meeting room scenario. We are developing a multi-modal meeting room data analysis system for studying meeting room interaction dynamics, in which head pose estimation is a key component. Since each subject in the meeting room is observed by a pair of stereo cameras, we perform 2D head tracking on the subject in each camera view, and the 3D coordinates of the head are obtained by triangulation. The 3D head pose is estimated in one camera's coordinate system, and we develop a procedure to accurately convert the estimated 3D pose from that camera coordinate system to the world coordinate system. In the experiment, visualization of the estimated head pose and location in the world coordinate system verifies the soundness of our design. The estimated head pose and 3D location of the subjects in the meeting room allow further analysis of meeting room interaction dynamics, such as F-formation, floor control, etc.
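
The two geometric steps the summary describes, triangulating the head's 3D position from the two tracked 2D image locations and converting a camera-frame pose into the world frame, can be sketched as follows. This is a minimal illustration and not the authors' implementation: the function names, the linear (DLT) triangulation, and the extrinsics convention x_cam = R @ x_world + t are all assumptions made for the example.

import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point seen by two cameras.

    P1, P2 : 3x4 projection matrices of the stereo pair (world frame).
    uv1, uv2 : (u, v) pixel coordinates of the tracked head in each view.
    Returns the 3D point in the frame the projection matrices share.
    """
    # Each view contributes two rows of the homogeneous system A X = 0.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector of the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def pose_cam_to_world(R_head_cam, R_extr, t_extr, X_cam=None):
    """Convert a head pose estimated in one camera's frame to the world frame.

    Assumes extrinsics of the form x_cam = R_extr @ x_world + t_extr, so
    the camera-to-world rotation is R_extr.T.
    """
    R_head_world = R_extr.T @ R_head_cam
    X_world = None if X_cam is None else R_extr.T @ (X_cam - t_extr)
    return R_head_world, X_world

# Hypothetical usage with calibration data for one stereo pair:
#   X = triangulate(P_left, P_right, (324.0, 210.5), (298.7, 212.1))
#   R_w, X_w = pose_cam_to_world(R_head, R_extr, t_extr, X_cam=X)

Here the triangulated point is already in the world frame if the projection matrices are expressed there, so only the head rotation (estimated per camera) needs the extrinsic conversion, which matches the pipeline the summary outlines.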