Recurrent temporal networks and language acquisition: from corticostriatal neurophysiology to reservoir computing
| Published in | Frontiers in Psychology, Vol. 4, p. 500 |
|---|---|
| Main Author | |
| Format | Journal Article |
| Language | English |
| Published | Switzerland: Frontiers Media S.A., 2013 |
| Summary | One of the most paradoxical aspects of human language is that it is so unlike any other form of behavior in the animal world, yet it has developed in a species not far removed from ancestral species that do not possess language. While aspects of non-human primate and avian interaction clearly constitute communication, this communication appears distinct from the rich, combinatorial, and abstract quality of human language. So how does the human primate brain allow for language? In an effort to answer this question, a line of research has been developed that attempts to build a language processing capability based in part on the gross neuroanatomy of the corticostriatal system of the human brain. This paper situates that research program in its historical context, which begins with the primate oculomotor system and sensorimotor sequencing and passes through recent advances in reservoir computing, in order to provide insight into the open questions, and possible approaches, for future research on modeling language processing. One novel and useful idea from this research is that the overlap of cortical projections onto common regions in the striatum allows for adaptive binding of cortical signals from distinct circuits, under the control of dopamine, which confers a strong adaptive advantage. A second idea is that recurrent cortical networks with fixed connections can represent arbitrary sequential and temporal structure, which is the basis of the reservoir computing framework. Finally, bringing these notions together, a relatively simple mechanism can be built for learning grammatical constructions, understood as the mappings from the surface structure of sentences to their meaning. This research suggests that the components of language that link conceptual structure to grammatical structure may be much simpler than has been proposed in other research programs. It also suggests that part of the residual complexity lies in the conceptual system itself. |
|---|---|
| Bibliography | This article was submitted to Frontiers in Language Sciences, a specialty of Frontiers in Psychology. Edited by: Franklin Chang, University of Liverpool, UK. Reviewed by: Matthias Schlesewsky, Johannes Gutenberg University Mainz, Germany; Hartmut Fitz, Max Planck Institute for Psycholinguistics, Netherlands. |
| ISSN | 1664-1078 |
| DOI | 10.3389/fpsyg.2013.00500 |
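The summary's second idea, that a recurrent network with fixed connections can represent sequential and temporal structure while only a linear readout is trained, is the core of the reservoir computing framework. The sketch below is a minimal echo state network illustration of that principle; the reservoir size, scaling constants, and the one-step memory task are illustrative assumptions for this sketch, not the models described in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                       # reservoir size (assumed)
W_in = rng.uniform(-0.5, 0.5, (N, 1))         # fixed, random input weights
W = rng.uniform(-0.5, 0.5, (N, N))            # fixed, random recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # spectral radius < 1 for stable echoes

def run_reservoir(u):
    """Drive the fixed recurrent network with a scalar input sequence u.

    No learning happens here: the reservoir passively accumulates a
    nonlinear trace of the input history in its state.
    """
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W_in[:, 0] * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy temporal task: reproduce the input delayed by one step, which
# requires the reservoir state to retain information about the past.
u = rng.uniform(-1, 1, 300)
X = run_reservoir(u)
target = np.roll(u, 1)
X_tr, y_tr = X[50:], target[50:]              # discard the initial transient

# Only the readout is trained (ridge regression); W and W_in never change.
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N), X_tr.T @ y_tr)
pred = X_tr @ W_out
err = np.sqrt(np.mean((pred - y_tr) ** 2))
print(f"RMSE on 1-step memory task: {err:.4f}")
```

The design point this illustrates is the one the summary makes: all sequential and temporal structure is carried by the fixed recurrent dynamics, so learning reduces to fitting a simple linear mapping from reservoir states to targets.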