Contextual cuing by global features

Bibliographic Details
Published in: Perception & Psychophysics, Vol. 68, No. 7, pp. 1204-1216
Main Authors: Kunar, Melina A.; Flusberg, Stephen J.; Wolfe, Jeremy M.
Format: Journal Article
Language: English
Published: Austin, TX: Psychonomic Society, 01.10.2006

Summary: In visual search tasks, attention can be guided to a target item, appearing amidst distractors, on the basis of simple features (e.g., finding the red letter among green). Chun and Jiang's (1998) contextual cuing effect shows that reaction times (RTs) are also speeded if the spatial configuration of items in a scene is repeated over time. In the present studies, we ask whether global properties of the scene can speed search (e.g., if the display is mostly red, then the target is at location X). In Experiment 1A, the overall background color of the display predicted the target location, and the predictive color could appear 0, 400, or 800 msec in advance of the search array. Mean RTs were faster in predictive than in nonpredictive conditions. However, there was little improvement in search slopes: the global color cue did not improve search efficiency. Experiments 1B-1F replicated this effect using different predictive properties (e.g., background orientation-texture and stimulus color). The results showed a strong RT effect of predictive background, but (at best) only a weak improvement in search efficiency. A strong improvement in efficiency was found, however, when the informative background was presented 1,500 msec prior to the onset of the search stimuli and when observers were given explicit instructions to use the cue (Experiment 2).
ISSN: 0031-5117; 1532-5962
DOI: 10.3758/BF03193721