Similarity-Aware Skill Reproduction based on Multi-Representational Learning from Demonstration

Bibliographic Details
Published in: arXiv.org
Main Authors: Hertel, Brendan; Ahmadzadeh, S. Reza
Format: Paper; Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 28.06.2024

Summary: Learning from Demonstration (LfD) algorithms enable humans to teach new skills to robots through demonstrations. A learned skill can be robustly reproduced from identical or nearby boundary conditions (e.g., the initial point). However, when a learned skill is generalized over boundary conditions with higher variance, the similarity of the reproductions changes from one boundary condition to another, and a single LfD representation cannot preserve consistent similarity across a generalization region. We propose a novel similarity-aware framework that combines multiple LfD representations with a similarity metric, improving skill generalization by finding the reproduction with the highest similarity value for a given boundary condition. Given a demonstration of the skill, our framework constructs a similarity region around a point of interest (e.g., the initial point) by evaluating the individual LfD representations with the similarity metric. Any point within this region corresponds to the representation that reproduces the skill with the greatest similarity. We validate our multi-representational framework in three simulated and four sets of real-world experiments using a physical 6-DOF robot. We also evaluate 11 different similarity metrics and categorize them according to their biases in 286 simulated experiments.
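The selection mechanism summarized above can be illustrated with a short sketch. The Python example below is a minimal illustration under stated assumptions, not the authors' implementation: two toy reproduction methods stand in for the framework's LfD representations, and a simple similarity metric (negative mean pointwise Euclidean distance, chosen here for illustration; the paper evaluates 11 metrics) scores each reproduction so that the best-scoring representation is selected for a given initial point. All function names are hypothetical.

```python
# Minimal sketch of similarity-aware representation selection (assumed
# structure, not the paper's code): each candidate LfD representation
# reproduces the skill from a new initial point, and the reproduction
# most similar to the demonstration is kept.
import numpy as np

def reproduce_linear_offset(demo, start):
    # Toy representation 1: shift the entire demonstration so it
    # begins at the new start point.
    return demo + (start - demo[0])

def reproduce_endpoint_blend(demo, start):
    # Toy representation 2: blend the offset toward zero along the
    # trajectory so the original endpoint is preserved.
    weights = np.linspace(1.0, 0.0, len(demo))[:, None]
    return demo + weights * (start - demo[0])

def similarity(demo, repro):
    # Illustrative similarity metric (an assumption): negative mean
    # pointwise Euclidean distance; higher means more similar.
    return -np.mean(np.linalg.norm(demo - repro, axis=1))

def best_reproduction(demo, start, representations):
    # Evaluate every representation at the given boundary condition
    # and return the reproduction with the highest similarity value.
    reproductions = [r(demo, start) for r in representations]
    return max(reproductions, key=lambda rep: similarity(demo, rep))

if __name__ == "__main__":
    # A 2-D demonstration trajectory and a perturbed initial point.
    ts = np.linspace(0.0, np.pi, 50)
    demo = np.stack([np.linspace(0.0, 1.0, 50), np.sin(ts)], axis=1)
    start = np.array([0.1, 0.2])
    repro = best_reproduction(
        demo, start, [reproduce_linear_offset, reproduce_endpoint_blend]
    )
    print("chosen reproduction starts at", repro[0])
```

Repeating this selection over a grid of candidate initial points would trace out the similarity region the summary describes: each point is associated with whichever representation scores highest there.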
ISSN: 2331-8422
DOI: 10.48550/arxiv.2110.14817