Bimodal speech: early suppressive visual effects in human auditory cortex

Bibliographic Details
Published in: The European Journal of Neuroscience, Vol. 20, No. 8, pp. 2225-2234
Main Authors: Besle, Julien; Fort, Alexandra; Delpuech, Claude; Giard, Marie-Hélène
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Science Ltd, 01.10.2004

Summary: Although everyone has experienced that seeing lip movements may improve speech perception, little is known about the neural mechanisms by which audiovisual speech information is combined. Event-related potentials (ERPs) were recorded while subjects performed an auditory recognition task among four different natural syllables randomly presented in the auditory (A), visual (V), or congruent bimodal (AV) condition. We found that: (i) bimodal syllables were identified more rapidly than auditory-alone stimuli; and (ii) this behavioural facilitation was associated with cross-modal [AV − (A + V)] ERP effects at around 120-190 ms latency, expressed mainly as a decrease of unimodal N1 generator activities in the auditory cortex. This finding provides evidence for suppressive, speech-specific audiovisual integration mechanisms, which are likely to be related to the dominance of the auditory modality for speech perception. Furthermore, the latency of the effect indicates that integration operates at pre-representational stages of stimulus analysis, probably via feedback projections from visual and/or polymodal areas.
ISSN: 0953-816X, 1460-9568
DOI: 10.1111/j.1460-9568.2004.03670.x