Syntactic systematicity arising from semantic predictions in a Hebbian-competitive network

Bibliographic Details
Published in: Connection Science, Vol. 13, No. 1, pp. 73-94
Main Authors: Hadley, Robert F.; Rotaru-Varga, Adam; Arnold, Dirk V.; Cardei, Vlad C.
Format: Journal Article
Language: English
Published: London: Taylor & Francis, 01.03.2001

Summary: A Hebbian-inspired, competitive network is presented which learns to predict the typical semantic features of denoting terms in simple and moderately complex sentences. In addition, the network learns to predict the appearance of syntactically key words, such as prepositions and relative pronouns. Importantly, as a by-product of the network's semantic training, a strong form of syntactic systematicity emerges. This systematicity is exhibited even at a novel, deeper level of clausal embedding. All network training is unsupervised with respect to error feedback. A novel variant of competitive learning and an unusual hierarchical architecture are presented. The relationship of this work to issues raised by Marcus and Phillips is explored.
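
As an illustrative aside, the sketch below shows a generic winner-take-all competitive Hebbian update (in the Rumelhart-Zipser style), the broad family of unsupervised, error-feedback-free learning the summary refers to. It is a minimal sketch under that assumption, not the paper's actual variant or hierarchical architecture; the input coding, unit count, and learning rate here are hypothetical.

    # Generic competitive Hebbian learning sketch (NOT Hadley et al.'s specific variant).
    import numpy as np

    def competitive_hebbian_step(weights, x, lr=0.05):
        """One unsupervised update: the unit most active for input x 'wins'
        and moves its weight vector toward x; no error signal is used."""
        activations = weights @ x                      # dot-product activation per unit
        winner = int(np.argmax(activations))           # winner-take-all competition
        weights[winner] += lr * (x - weights[winner])  # Hebbian-style update, winner only
        return winner

    # Toy usage: 4 competing units, 6-dimensional binary feature inputs.
    rng = np.random.default_rng(0)
    W = rng.uniform(0.0, 1.0, size=(4, 6))
    for _ in range(200):
        x = (rng.random(6) > 0.5).astype(float)
        competitive_hebbian_step(W, x)
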
ISSN: 0954-0091, 1360-0494
DOI: 10.1080/09540090110052996