Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners

Bibliographic Details
Published in: eLife, Vol. 4
Main Authors: Maddox, Ross K; Atilgan, Huriye; Bizley, Jennifer K; Lee, Adrian K C
Format: Journal Article
Language: English
Published: England: eLife Sciences Publications Ltd, 05.02.2015
Summary: In noisy settings, listening is aided by correlated dynamic visual cues gleaned from a talker's face, an improvement often attributed to visually reinforced linguistic information. In this study, we aimed to test the effect of audio-visual temporal coherence alone on selective listening, free of linguistic confounds. We presented listeners with competing auditory streams whose amplitude varied independently and a visual stimulus with varying radius, while manipulating the cross-modal temporal relationships. Performance improved when the auditory target's timecourse matched that of the visual stimulus. The fact that the coherence was between task-irrelevant stimulus features suggests that the observed improvement stemmed from the integration of auditory and visual streams into cross-modal objects, enabling listeners to better attend the target. These findings suggest that in everyday conditions, where listeners can often see the source of a sound, temporal cues provided by vision can help listeners to select one sound source from a mixture.
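As a purely illustrative sketch of the paradigm described in the summary, the Python snippet below generates two competing auditory streams with statistically independent amplitude envelopes and a visual radius timecourse that either tracks the target envelope (coherent condition) or the masker envelope (incoherent condition). All parameters and helper names here (sample rate, envelope bandwidth, carrier frequencies, frame rate, random_envelope) are assumptions chosen for illustration only; they are not the authors' stimulus code or the values used in the study.

```python
import numpy as np

# Illustrative sketch only -- not the authors' stimulus code.
# All parameter values below are placeholder assumptions.
FS = 44100          # audio sample rate (Hz), assumed
DUR = 4.0           # trial duration (s), assumed
ENV_CUTOFF = 7.0    # envelope bandwidth (Hz), assumed

def random_envelope(rng, n, fs, cutoff):
    """Low-pass-filtered noise scaled to [0, 1]: a slowly varying amplitude envelope."""
    noise = rng.standard_normal(n)
    spectrum = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum[freqs > cutoff] = 0.0          # crude low-pass in the frequency domain
    env = np.fft.irfft(spectrum, n)
    env -= env.min()
    return env / env.max()

rng = np.random.default_rng(0)
n = int(FS * DUR)
t = np.arange(n) / FS

# Two competing auditory streams whose amplitudes vary independently.
target_env = random_envelope(rng, n, FS, ENV_CUTOFF)
masker_env = random_envelope(rng, n, FS, ENV_CUTOFF)
target = target_env * np.sin(2 * np.pi * 440.0 * t)   # target carrier (assumed 440 Hz)
masker = masker_env * np.sin(2 * np.pi * 660.0 * t)   # masker carrier (assumed 660 Hz)
mixture = target + masker

# Visual stimulus: a disc whose radius follows one of the auditory envelopes.
# Coherent condition: radius tracks the target envelope; incoherent: the masker's.
frame_rate = 60                                        # video frame rate (Hz), assumed
frame_idx = (np.arange(int(frame_rate * DUR)) * FS // frame_rate).astype(int)
radius_coherent = 0.5 + 0.5 * target_env[frame_idx]    # arbitrary units
radius_incoherent = 0.5 + 0.5 * masker_env[frame_idx]
```

In this sketch, temporal coherence is carried entirely by the shared envelope timecourse, while the task-relevant auditory feature could be something else (e.g., a timbre or pitch change), mirroring the summary's point that the coherence was between task-irrelevant stimulus features.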
ISSN: 2050-084X
DOI: 10.7554/eLife.04995