Can VR Help Understand Auditory-Visual Associations and Synaesthesia Through Immersive Battery Tests?
| Published in | 2022 10th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), pp. 1 - 5 |
|---|---|
| Main Author | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 18.10.2022 |
| Subjects | |
Summary: This paper gives an overview of my PhD research on auditory-visual associations in Human-Computer Interaction (HCI). My project explores synaesthesia and auditory-visual cross-modal associations to gain a better understanding of how our auditory and visual senses combine to inform perception and decision-making. I seek to understand how technology can be used to strengthen our auditory-visual associations. Previous work suggests that creating effective, immersive environments using Virtual Reality (VR) can yield new insights into auditory-visual perception. Firstly, I will create VR versions of standard auditory-visual and synaesthesia tests, conduct user studies with both the traditional and VR-based tests, and compare the results to see what impact a more immersive environment such as VR has on auditory-visual associations. Secondly, armed with this new knowledge of auditory-visual associations in VR, I will create a framework that uses auditory-visual associations to help designers create interfaces that are personalised, adaptive, and accessible to everyone.
DOI: 10.1109/ACIIW57231.2022.10086003