Rapid Word Learning Under Uncertainty via Cross-Situational Statistics

Bibliographic Details
Published in: Psychological Science, Vol. 18, No. 5, pp. 414-420
Main Authors: Yu, Chen; Smith, Linda B.
Format: Journal Article
Language: English
Published: Los Angeles, CA: Blackwell Publishing (SAGE Publications), 01.05.2007

Summary: There are an infinite number of possible word-to-world pairings in naturalistic learning environments. Previous proposals to solve this mapping problem have focused on linguistic, social, representational, and attentional constraints at a single moment. This article discusses a cross-situational learning strategy based on computing distributional statistics across words, across referents, and, most important, across the co-occurrences of words and referents at multiple moments. We briefly exposed adults to a set of trials that each contained multiple spoken words and multiple pictures of individual objects; no information about word-picture correspondences was given within a trial. Nonetheless, over trials, subjects learned the word-picture mappings through cross-trial statistical relations. Different learning conditions varied the degree of within-trial reference uncertainty, the number of trials, and the length of trials. Overall, the remarkable performance of learners in various learning conditions suggests that they calculate cross-trial statistics with sufficient fidelity and by doing so rapidly learn word-referent pairs even in highly ambiguous learning contexts.
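
The cross-trial strategy described in the summary lends itself to a simple computational illustration. The sketch below assumes a bare co-occurrence-counting learner: tally every word-referent pairing seen within a trial, then map each word to its most frequent companion. The function name, nonce words, and referent labels are hypothetical; the article reports that learners behave as if they aggregate such statistics, not that they use this exact algorithm.

    from collections import defaultdict

    def cross_situational_learner(trials):
        # Tally how often each word co-occurs with each pictured referent.
        counts = defaultdict(lambda: defaultdict(int))
        for words, referents in trials:
            # Within a trial, every word is heard alongside every picture,
            # so no single trial reveals which pairing is correct.
            for word in words:
                for referent in referents:
                    counts[word][referent] += 1
        # Across trials, a word's true referent accumulates the highest count.
        return {word: max(refs, key=refs.get) for word, refs in counts.items()}

    # Hypothetical 2x2 condition: two spoken words and two pictures per trial.
    trials = [
        (["bosa", "gasser"], ["dog", "ball"]),
        (["bosa", "manu"], ["dog", "cup"]),
        (["gasser", "manu"], ["ball", "cup"]),
    ]
    print(cross_situational_learner(trials))
    # {'bosa': 'dog', 'gasser': 'ball', 'manu': 'cup'}

Three ambiguous trials already resolve all three mappings uniquely, which is the abstract's central point: statistics aggregated across moments disambiguate what no single moment can.
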
ISSN: 0956-7976; 1467-9280
DOI: 10.1111/j.1467-9280.2007.01915.x