Involuntary orienting to sound improves visual perception

Bibliographic Details
Published in: Nature (London), Vol. 407, No. 6806, pp. 906-908
Main Authors: McDonald, John J.; Teder-Sälejärvi, Wolfgang A.; Hillyard, Steven A.
Format: Journal Article
Language: English
Published: London: Nature Publishing Group, 19 October 2000
Summary:To perceive real-world objects and events, we need to integrate several stimulus features belonging to different sensory modalities. Although the neural mechanisms and behavioural consequences of intersensory integration have been extensively studied, the processes that enable us to pay attention to multimodal objects are still poorly understood. An important question is whether a stimulus in one sensory modality automatically attracts attention to spatially coincident stimuli that appear subsequently in other modalities, thereby enhancing their perceptual salience. The occurrence of an irrelevant sound does facilitate motor responses to a subsequent light appearing nearby. However, because participants in previous studies made speeded responses rather than psychophysical judgements, it remains unclear whether involuntary auditory attention actually affects the perceptibility of visual stimuli as opposed to postperceptual decision and response processes. Here we provide psychophysical evidence that a sudden sound improves the detectability of a subsequent flash appearing at the same location. These data show that the involuntary orienting of attention to sound enhances early perceptual processing of visual stimuli.
ISSN: 0028-0836 (print), 1476-4687 (online)
DOI: 10.1038/35038085