Multimodal control system for autonomous vehicles using speech and gesture recognition

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, Vol. 140, no. 4, pp. 2963-2964
Main Authors: Nakagawa, Takuma; Kitaoka, Norihide
Format: Journal Article
Language: English; Japanese
Published: 01.10.2016

Summary: The recent development of autonomous vehicles has attracted much attention, but operating these vehicles may be too complex for average users. We therefore propose an intuitive, multimodal interface for controlling autonomous vehicles that uses speech and gesture recognition to interpret and execute user commands. For example, if the user says “turn there” while pointing at a landmark, the vehicle can use this behavior to correctly understand and comply with the user’s intent. To achieve this, we designed a two-part interface consisting of a multimodal understanding component and a dialog control component. These components can be viewed as a concatenation of two separate transducers, one for multimodal understanding and the other for a conventional dialog system, from which we construct a single combined transducer. We developed various scenarios that might arise while operating an autonomous vehicle and displayed these scenes on a monitor. Subjects were then asked to operate a virtual car using speech commands and pointing gestures while observing the monitor. Questionnaire results show that subjects felt they were able to easily and naturally operate the autonomous vehicle using utterances and gestures.
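
The abstract describes grounding a deictic word such as “there” in the landmark the user is pointing at before passing the interpreted command to dialog control. The authors' actual system composes transducers; the short Python sketch below only illustrates that fusion idea under assumed representations. Every name here (Gesture, Command, multimodal_understanding, dialog_control) and the simple keyword rules are illustrative assumptions, not the paper's implementation.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Gesture:
        # Landmark the pointing gesture is directed at (hypothetical representation).
        target: str

    @dataclass
    class Command:
        action: str    # e.g. "turn"
        location: str  # resolved landmark or explicitly named place

    def multimodal_understanding(utterance: str, gesture: Optional[Gesture]) -> Command:
        """Ground deictic words ("there", "that") in the pointed-at landmark."""
        tokens = utterance.lower().split()
        action = tokens[0] if tokens else "stop"
        if gesture is not None and any(w in tokens for w in ("there", "that")):
            location = gesture.target                   # resolve "there" from the gesture
        else:
            location = " ".join(tokens[1:]) or "ahead"  # fall back to the spoken words
        return Command(action=action, location=location)

    def dialog_control(cmd: Command) -> str:
        """Stand-in for the dialog control stage: confirm the interpreted command."""
        return f"OK, I will {cmd.action} at the {cmd.location}."

    # Example: the user says "turn there" while pointing at the next intersection.
    print(dialog_control(multimodal_understanding("turn there", Gesture("next intersection"))))

Running the example prints a confirmation for turning at the pointed-at intersection, mirroring how the gesture disambiguates the spoken “there” in the scenario quoted above.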
ISSN: 0001-4966; 1520-8524
DOI: 10.1121/1.4969161