The dog's meow: asymmetrical interaction in cross-modal object recognition

Bibliographic Details
Published in: Experimental Brain Research, Vol. 193, No. 4, pp. 603–614
Main Authors: Yuval-Greenberg, Shlomit; Deouell, Leon Y.
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer-Verlag, 01.03.2009

Summary: Little is known about cross-modal interaction in complex object recognition. The factors influencing this interaction were investigated using simultaneous presentation of pictures and vocalizations of animals. In separate blocks, the task was to identify either the visual or the auditory stimulus, ignoring the other modality. The pictures and the sounds were congruent (same animal), incongruent (different animals) or neutral (animal with meaningless stimulus). Performance in congruent trials was better than in incongruent trials, regardless of whether subjects attended the visual or the auditory stimuli, but the effect was larger in the latter case. This asymmetry persisted when a long delay was added between the stimulus and the response; thus, it cannot be explained by a lack of processing time for the auditory stimulus. However, the asymmetry was eliminated when low-contrast visual stimuli were used. These findings suggest that when visual stimulation is highly informative, it affects auditory recognition more than auditory stimulation affects visual recognition. Nevertheless, this modality dominance is not rigid; it is highly influenced by the quality of the presented information.
ISSN: 0014-4819
eISSN: 1432-1106
DOI: 10.1007/s00221-008-1664-6