Dual Leap Motion Controller 2: A Robust Dataset for Multi-view Hand Pose Recognition

Bibliographic Details
Published in: Scientific Data, Vol. 11, No. 1, pp. 1102-14
Main Authors: Gil-Martín, Manuel; Marini, Marco Raoul; San-Segundo, Rubén; Cinque, Luigi
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 09.10.2024

Summary: This paper presents the Multi-view Leap2 Hand Pose Dataset (ML2HP Dataset), a new dataset for hand pose recognition, captured using a multi-view recording setup with two Leap Motion Controller 2 devices. The dataset encompasses a diverse range of hand poses, recorded from different angles to ensure comprehensive coverage. It includes real images together with precise, automatically extracted hand properties such as landmark coordinates, velocities, orientations, and finger widths. The dataset has been meticulously designed and curated to maintain balance across subjects, hand poses, and right- versus left-hand usage, ensuring fairness and parity. It contains 714,000 instances from 21 subjects performing 17 different hand poses (including real images and 247 associated hand properties per instance). The multi-view setup is necessary to mitigate hand occlusion, ensuring the continuous tracking and pose estimation required in real human-computer interaction applications. This dataset contributes to advancing the field of multimodal hand pose recognition by providing a valuable resource for developing advanced AI-based human-computer interfaces.
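As a quick sanity check on the reported figures, the total instance count is consistent with a perfectly balanced design over subjects, poses, and hands. The per-combination count of 1,000 instances below is an inference from the stated totals, not a figure given in this record:

```python
# Illustrative arithmetic (assumption: instances are split evenly across
# every (subject, pose, hand) combination; the record only states the
# dataset is balanced, not the exact per-cell count).
subjects = 21
poses = 17
hands = 2  # right and left hand

combinations = subjects * poses * hands          # 714 combinations
per_combination = 714_000 // combinations        # inferred: 1000 per cell

print(combinations)      # 714
print(per_combination)   # 1000
print(combinations * per_combination == 714_000) # True
```

The clean divisibility (714,000 / 714 = 1,000) supports the record's claim of balance across subjects, poses, and hand usage.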
ISSN: 2052-4463
DOI: 10.1038/s41597-024-03968-9