The combination of vision and touch depends on spatial proximity

Bibliographic Details
Published in: Journal of Vision (Charlottesville, Va.), Vol. 5, No. 11, pp. 1013–1023
Main Authors: Gepshtein, Sergei; Burge, Johannes; Ernst, Marc O.; Banks, Martin S.
Format: Journal Article
Language: English
Published: United States, 28.12.2005
Summary: The nervous system often combines visual and haptic information about object properties such that the combined estimate is more precise than with vision or haptics alone. We examined how the system determines when to combine the signals. Presumably, signals should not be combined when they come from different objects. The likelihood that signals come from different objects is highly correlated with the spatial separation between the signals, so we asked how the spatial separation between visual and haptic signals affects their combination. To do this, we first created conditions for each observer in which the effect of combination (the increase in discrimination precision with two modalities relative to performance with one modality) should be maximal. Then under these conditions, we presented visual and haptic stimuli separated by different spatial distances and compared human performance with predictions of a model that combined signals optimally. We found that discrimination precision was essentially optimal when the signals came from the same location, and that discrimination precision was poorer when the signals came from different locations. Thus, the mechanism of visual-haptic combination is specialized for signals that coincide in space.
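For orientation (a sketch, not part of the record): the "model that combined signals optimally" referenced in the summary is, in this literature, typically the maximum-likelihood cue-integration rule (cf. Ernst & Banks, 2002). Assuming independent Gaussian noise on the single-modality estimates \hat{S}_V and \hat{S}_H with variances \sigma_V^2 and \sigma_H^2 (an assumption for illustration; see the paper for the authors' exact formulation), the rule and its predicted precision benefit are

\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2}, \qquad w_H = \frac{\sigma_V^2}{\sigma_V^2 + \sigma_H^2},

\sigma_{VH}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min(\sigma_V^2, \sigma_H^2).

Note that the relative benefit of combination is largest when \sigma_V^2 = \sigma_H^2 (then \sigma_{VH}^2 = \sigma_V^2 / 2), which is presumably why the authors first tuned conditions per observer so that the effect of combination "should be maximal".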
ISSN: 1534-7362
DOI: 10.1167/5.11.7