A First-Order-Logic Based Model for Grounded Language Learning

Bibliographic Details
Published in: Advances in Intelligent Data Analysis XIV, Vol. 9385, pp. 49–60
Main Authors: Becerra-Bonache, Leonor; Blockeel, Hendrik; Galván, María; Jacquenet, François
Format: Book Chapter
Language: English
Published: Springer International Publishing AG, Switzerland, 2015
Series: Lecture Notes in Computer Science
Summary: Much is still unknown about how children learn language, but it is clear that they perform “grounded” language learning: they learn the grammar and vocabulary not just from examples of sentences, but from examples of sentences in a particular context. Grounded language learning has been the subject of much research. Most of this work focuses on particular aspects, such as constructing semantic parsers, or on particular types of applications. In this paper, we take a broader view that includes an aspect that has received little attention until now: learning the meaning of phrases from phrase/context pairs in which the phrase’s meaning is not explicitly represented. We propose a simple model for this task that uses first-order logic representations for contexts and meanings, including a simple incremental learning algorithm. We experimentally demonstrate that the proposed model can explain the gradual learning of simple concepts and language structure, and that it can easily be used for interpretation, generation, and translation of phrases.
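To make the summary's setting concrete, the following is a minimal illustrative sketch of cross-situational learning from phrase/context pairs, where a context is a set of ground first-order atoms and a phrase's hypothesized meaning is incrementally narrowed to the atoms shared by all contexts in which it occurs. The class name, the string encoding of atoms, and the intersection rule are all assumptions made for illustration; this is not the algorithm presented in the chapter.

```python
# Toy cross-situational learner (illustrative only, not the chapter's method).
# A context is a set of ground first-order atoms, encoded here as strings.

class PhraseMeaningLearner:
    def __init__(self):
        # phrase -> current meaning hypothesis (a set of ground atoms)
        self.meanings = {}

    def observe(self, phrase, context):
        """Incrementally refine the meaning of `phrase` from one more
        phrase/context pair: keep only atoms seen in every context."""
        context = set(context)
        if phrase not in self.meanings:
            self.meanings[phrase] = context   # first guess: the whole context
        else:
            self.meanings[phrase] &= context  # intersect with the new context


learner = PhraseMeaningLearner()
learner.observe("the red square",
                {"square(o1)", "red(o1)", "circle(o2)", "blue(o2)"})
learner.observe("the red square",
                {"square(o1)", "red(o1)", "triangle(o3)", "green(o3)"})
print(sorted(learner.meanings["the red square"]))  # ['red(o1)', 'square(o1)']
```

After two observations, the irrelevant atoms (the circle and the triangle) have been pruned, leaving only the atoms that co-occur with the phrase, which is the intuition behind learning meanings without an explicit meaning annotation.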
ISBN: 3319244647; 9783319244648
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-24465-5_5