A Neural Dynamic Model Generates Descriptions of Object‐Oriented Actions


Bibliographic Details
Published in: Topics in Cognitive Science, Vol. 9, No. 1, pp. 35-47
Main Authors: Richter, Mathis; Lins, Jonas; Schöner, Gregor
Format: Journal Article
Language: English
Published: United States: Wiley Subscription Services, Inc., 01.01.2017

Summary: Describing actions entails that relations between objects are discovered. A pervasively neural account of this process requires that fundamental problems are solved: the neural pointer problem, the binding problem, and the problem of generating discrete processing steps from time‐continuous neural processes. We present a prototypical solution to these problems in a neural dynamic model that comprises dynamic neural fields holding representations close to sensorimotor surfaces as well as dynamic neural nodes holding discrete, language‐like representations. Making the connection between these two types of representations enables the model to describe actions as well as to perceptually ground movement phrases—all based on real visual input. We demonstrate how the dynamic neural processes autonomously generate the processing steps required to describe or ground object‐oriented actions. By solving the fundamental problems of neural pointing, binding, and emergent discrete processing, the model may be a first but critical step toward a systematic neural processing account of higher cognition.
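The dynamic neural fields mentioned in the summary are commonly modeled with Amari-style field equations, in which a localized input induces a self-stabilized activation peak through local excitation and surround inhibition. The following is a minimal one-dimensional sketch of that generic mechanism; all parameter values, the kernel shape, and the stimulus are illustrative assumptions, not the values used in the article's model.

```python
import numpy as np

# Illustrative sketch of a 1-D dynamic neural field (Amari-style).
# All parameters below are assumptions for demonstration purposes.
N, dx = 101, 1.0
x = np.arange(N) * dx
tau, dt, h = 10.0, 1.0, -5.0        # time constant, Euler step, resting level

def gauss(center, width, amp):
    """Gaussian profile over the field's spatial dimension."""
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

# Lateral interaction kernel: narrow local excitation, broader inhibition.
dist = np.abs(x[:, None] - x[None, :])
kernel = 1.0 * np.exp(-0.5 * (dist / 3.0) ** 2) \
       - 0.5 * np.exp(-0.5 * (dist / 6.0) ** 2)

def sigmoid(u, beta=1.5):
    """Threshold function: only sufficiently active sites interact."""
    return 1.0 / (1.0 + np.exp(-beta * u))

stimulus = gauss(center=50.0, width=4.0, amp=8.0)   # localized input bump
u = np.full(N, h)                                   # field starts at rest

# Euler integration of tau*du/dt = -u + h + S(x) + kernel * sigmoid(u)
for _ in range(300):
    interaction = (kernel @ sigmoid(u)) * dx
    u += (dt / tau) * (-u + h + stimulus + interaction)

peak = int(np.argmax(u))  # a suprathreshold peak forms over the stimulus
```

After relaxation, activation exceeds zero only near the stimulated location, which is the field-level analogue of "pointing" to an object location from continuous sensory input.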
Bibliography: This article is part of the topic "Best of Papers from the Cognitive Science Society Annual Conference," Wayne D. Gray (Topic Editor). For a full listing of topic papers, see http://onlinelibrary.wiley.com/doi/10.1111/tops.2017.9.issue-1/issuetoc
ISSN: 1756-8757, 1756-8765
DOI: 10.1111/tops.12240