Compensating for Fingertip Size to Render Tactile Cues More Accurately

Bibliographic Details
Published in: IEEE Transactions on Haptics, Vol. 13, No. 1, pp. 144-151
Main Authors: Young, Eric M.; Gueorguiev, David; Kuchenbecker, Katherine J.; Pacchierotti, Claudio
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01.01.2020

More Information
Summary: Fingertip haptic feedback offers advantages in many applications, including robotic teleoperation, gaming, and training. However, fingertip size and shape vary significantly across humans, making it difficult to design fingertip interfaces and rendering techniques suitable for everyone. This article starts with an existing data-driven haptic rendering algorithm that ignores fingertip size, and it then develops two software-based approaches to personalize this algorithm for fingertips of different sizes using either additional data or geometry. We evaluate our algorithms in the rendering of pre-recorded tactile sensations onto rubber casts of six different fingertips as well as onto the real fingertips of 13 human participants. Results on the casts show that both approaches significantly improve performance, reducing force error magnitudes by an average of 78% with respect to the standard non-personalized rendering technique. Congruent results were obtained for real fingertips, with subjects rating each of the two personalized rendering techniques significantly better than the standard non-personalized method.
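
Illustrative sketch (not from the article): the summary describes personalizing a rendering algorithm by fingertip geometry. A minimal Python sketch of that general idea, assuming a simple linear relation between fingertip width and rendered indentation depth, is given below. The reference width, function names, and scaling rule are hypothetical placeholders, not the authors' published method.

import numpy as np

# Hypothetical geometry-based personalization: scale a tactile cue recorded on a
# reference fingertip by the ratio of the target fingertip's width to the
# reference width. The linear-scaling assumption is for illustration only.

REFERENCE_FINGER_WIDTH_MM = 16.0  # assumed width of the fingertip used for recording

def personalize_contact_depth(recorded_depth_mm: np.ndarray,
                              target_finger_width_mm: float) -> np.ndarray:
    """Scale a recorded indentation-depth trajectory for a differently sized fingertip."""
    scale = target_finger_width_mm / REFERENCE_FINGER_WIDTH_MM
    return recorded_depth_mm * scale

def depth_to_platform_command(depth_mm: np.ndarray,
                              max_travel_mm: float = 3.0) -> np.ndarray:
    """Convert indentation depth to a normalized [0, 1] command for a wearable
    fingertip device whose contact platform has max_travel_mm of travel."""
    return np.clip(depth_mm / max_travel_mm, 0.0, 1.0)

if __name__ == "__main__":
    # A short recorded cue (mm of indentation over time) on the reference fingertip.
    recorded = np.array([0.0, 0.4, 0.9, 1.3, 1.1, 0.6, 0.0])
    # Personalize for a narrower fingertip, e.g., 13 mm wide.
    personalized = personalize_contact_depth(recorded, target_finger_width_mm=13.0)
    print(depth_to_platform_command(personalized))

The article's data-based variant would instead derive the mapping from additional per-fingertip measurements; the single scale factor above stands in only for the geometry-based idea mentioned in the summary.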
ISSN: 1939-1412, 2329-4051
DOI: 10.1109/TOH.2020.2966993