Management of Multimodal User Interaction in Companion-Systems
Published in | Companion Technology, pp. 187–207 |
---|---|
Main Authors | |
Format | Book Chapter |
Language | English |
Published | Cham: Springer International Publishing, 01.01.2017 |
Series | Cognitive Technologies |
Summary | While interacting, human beings continuously adapt their way of communication to their surroundings and their communication partner. Although present-day context-aware ubiquitous systems gather a great deal of information to maximize their functionality, they predominantly offer rather static ways to communicate. To fulfill the user's communication needs and demands, the varied information from ubiquitous sensors could be used to dynamically adapt the user interface. Considering such adaptive user interface management a major and relevant component of a Companion-Technology, we also have to cope with emotional and dispositional user input as a source of implicit user requests and demands. In this chapter we demonstrate how multimodal fusion based on evidential reasoning and probabilistic fission with adaptive reasoning can act together to form a highly adaptive, model-driven interactive system component for multimodal interaction. The presented interaction management (IM) can handle uncertain or ambiguous data throughout the complete interaction cycle with a user. In addition, we present the IM's architecture and its model-driven concept. Finally, we discuss its role within the framework of the other constituents of a Companion-Technology. |
ISBN | 3319436643; 9783319436647 |
ISSN | 1611-2482; 2197-6635 |
DOI | 10.1007/978-3-319-43665-4_10 |