A Fuzzy Inference System for a Visually Grounded Robot State of Mind

Bibliographic Details
Published in: https://www.umu.se/en/research/projects/autonomous-systems-ability-to-understand-their-own-limitations, p. 2402
Main Authors: Singh, Avinash; Baranwal, Neha; Richter, Kai-Florian
Format: Conference Proceeding
Language: English
Published: 2020
Series: Frontiers in Artificial Intelligence and Applications

More Information
Summary: In order for robots to interact with humans in real-world scenarios or about real-world objects, these robots need to construct a representation (‘state of mind’) of these scenarios that a) is grounded in the robots’ perception and b) ideally matches human understanding and concepts. Using table-top settings as the scenario, we propose a framework that generates a robot’s ‘state of mind’ by extracting the objects on the table along with their properties (color, shape and texture) and spatial relations to each other. The scene as perceived by the robot is represented in a dynamic graph in which object attributes are encoded as fuzzy linguistic variables that match human spatial concepts. In particular, this paper details the construction of such graph representations by combining low-level neural network-based feature recognition and a high-level fuzzy inference system. Using fuzzy representations allows for easily adapting the robot’s original scene representation to deviations in properties or relations that emerge in language descriptions given by humans viewing the same scene. The framework is implemented on a Pepper humanoid robot and has been evaluated using a data set collected in-house.
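
The summary describes encoding object attributes and spatial relations as fuzzy linguistic variables in a dynamic scene graph. The following is a minimal illustrative sketch of that idea in Python; the class names, relation vocabulary, and triangular membership parameters are assumptions for demonstration only and do not reflect the paper's actual implementation.

```python
# Illustrative sketch (not the paper's implementation): a table-top scene graph
# whose edges carry fuzzy linguistic spatial relations. All names, membership
# function shapes, and parameter values are assumptions for demonstration.
from dataclasses import dataclass, field


def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)


@dataclass
class SceneObject:
    name: str
    x: float          # position on the table plane (hypothetical units: metres)
    y: float
    color: str        # low-level attributes, e.g. from a neural network recognizer
    shape: str


def fuzzy_relations(a: SceneObject, b: SceneObject) -> dict[str, float]:
    """Degrees to which 'a <relation> b' holds, as fuzzy linguistic variables."""
    dx, dy = a.x - b.x, a.y - b.y
    dist = (dx ** 2 + dy ** 2) ** 0.5
    return {
        "left of":  triangular(dx, -0.6, -0.3, 0.0),
        "right of": triangular(dx, 0.0, 0.3, 0.6),
        "near":     triangular(dist, -0.01, 0.0, 0.4),
        "far from": triangular(dist, 0.2, 0.8, 10.0),
    }


@dataclass
class SceneGraph:
    objects: list[SceneObject] = field(default_factory=list)
    # edges: (subject name, object name) -> {linguistic relation: membership degree}
    edges: dict[tuple[str, str], dict[str, float]] = field(default_factory=dict)

    def add(self, obj: SceneObject) -> None:
        # Recompute fuzzy spatial relations against every existing object,
        # so the graph stays up to date as the perceived scene changes.
        for other in self.objects:
            self.edges[(obj.name, other.name)] = fuzzy_relations(obj, other)
            self.edges[(other.name, obj.name)] = fuzzy_relations(other, obj)
        self.objects.append(obj)


if __name__ == "__main__":
    graph = SceneGraph()
    graph.add(SceneObject("cup", x=0.10, y=0.20, color="red", shape="cylinder"))
    graph.add(SceneObject("book", x=0.45, y=0.18, color="blue", shape="cuboid"))
    for (subj, obj), rels in graph.edges.items():
        best = max(rels, key=rels.get)
        print(f"{subj} is {best} {obj} (degree {rels[best]:.2f})")
```

Because relations are stored as membership degrees rather than crisp labels, such a representation can in principle be shifted toward a human's wording (e.g. lowering the threshold for "near") without re-running the low-level perception, which is the adaptability the summary attributes to the fuzzy approach.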
ISBN: 9781643681009, 9781643681016, 1643681001, 164368101X
DOI: 10.3233/FAIA200371