Visual attention in spoken human-robot interaction
Psycholinguistic studies of situated language processing have revealed that gaze in the visual environment is tightly coupled with both spoken language comprehension and production. It has also been established that interlocutors monitor the gaze of their partners, a phenomenon called "joint attention", as a further means for facilitating mutual understanding. We hypothesise that human-robot interaction will benefit when the robot's language-related gaze behaviour is similar to that of people, potentially providing the user with valuable non-verbal information concerning the robot's intended message or the robot's successful understanding. We report findings from two eye-tracking experiments demonstrating (1) that human gaze is modulated by both the robot speech and gaze, and (2) that human comprehension of robot speech is improved when the robot's real-time gaze behaviour is similar to that of humans.
| Published in | 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 77–84 |
|---|---|
| Main Authors | , |
| Format | Conference Proceeding |
| Language | English |
| Published | New York, NY, USA: ACM; IEEE, 09.03.2009 |
| Series | ACM Conferences |
| ISBN | 1605584045; 9781605584041 |
| ISSN | 2167-2121 |
| DOI | 10.1145/1514095.1514111 |