Touch Gesture Recognition System based on 1D Convolutional Neural Network with Two Touch Sensor Orientation Settings

Bibliographic Details
Published in 2019 16th International Conference on Ubiquitous Robots (UR), pp. 65-70
Main Authors Park, Joo-Hye, Seo, Ju-Hwan, Nho, Young-Hoon, Kwon, Dong-Soo
Format Conference Proceeding
Language English
Published IEEE 01.06.2019
Abstract Touch is regarded as an important channel in human-robot interaction. This paper presents a touch gesture recognition system that can be applied to hard-skinned robots. Related studies have relied on traditional machine learning methods with hand-crafted features, which makes it difficult for developers to reach optimal features that they cannot imagine on their own. To avoid this, our proposed touch gesture recognition system uses a 1D convolutional neural network (1D CNN) that learns features directly from the data. The recognition system classifies four touch patterns: hit, pat, push, and rub. The results show an average recognition rate of 90.5%, which is higher than that of a related study. Additionally, we verify the effect of touch sensor orientation on recognition performance. Many studies report accuracy with a touch sensor installed in only one orientation. In this study, we experimentally confirm that a classifier trained with data from a vertically installed touch sensor shows degraded performance on test data from a horizontally installed touch sensor, and vice versa. To achieve high recognition accuracy for both orientations, the network is retrained with data from both vertically and horizontally installed sensors. The results show accuracy rates of 88.5% and 89.1% for the vertical and horizontal test data, respectively. That is, the model achieves reliable performance in both orientations, whereas classifiers trained with data from a single orientation perform poorly on test data from the other orientation.
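The pipeline the abstract describes (raw touch-sensor windows fed to a 1D CNN that outputs one of four gesture classes) can be sketched as follows. This is a minimal illustration assuming a PyTorch implementation; the window length, channel count, and layer sizes are placeholder assumptions, not the architecture reported in the paper.

```python
# Minimal 1D CNN sketch for four-class touch gesture recognition (hit, pat,
# push, rub). WINDOW, CHANNELS, and the layer sizes are illustrative
# assumptions; the paper's actual architecture is not given in this record.
import torch
import torch.nn as nn

WINDOW = 128      # touch-sensor samples per gesture window (assumed)
CHANNELS = 1      # sensor channels fed to the network (assumed)
NUM_CLASSES = 4   # hit, pat, push, rub

class TouchGestureCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Two conv/pool stages learn temporal features directly from raw data,
        # replacing hand-crafted features.
        self.features = nn.Sequential(
            nn.Conv1d(CHANNELS, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # After two poolings the window length is WINDOW // 4.
        self.classifier = nn.Linear(32 * (WINDOW // 4), NUM_CLASSES)

    def forward(self, x):
        # x: (batch, CHANNELS, WINDOW) -> logits: (batch, NUM_CLASSES)
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    model = TouchGestureCNN()
    batch = torch.randn(8, CHANNELS, WINDOW)  # 8 dummy gesture windows
    print(model(batch).shape)                  # torch.Size([8, 4])
```

Training on a dataset that mixes windows collected from both vertically and horizontally installed sensors, as the abstract describes, would require no change to such a network, only to the composition of the training set.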
Author Affiliations
– Park, Joo-Hye: Human-Robot Interaction Research Center, Korea Advanced Institute of Science and Technology, Daejeon, 34141, South Korea
– Seo, Ju-Hwan: Human-Robot Interaction Research Center, Korea Advanced Institute of Science and Technology, Daejeon, 34141, South Korea
– Nho, Young-Hoon: Human-Robot Interaction Research Center, Korea Advanced Institute of Science and Technology, Daejeon, 34141, South Korea
– Kwon, Dong-Soo: Faculty of Mechanical Engineering, Korea Advanced Institute of Science and Technology, Daejeon, 34141, South Korea
DOI 10.1109/URAI.2019.8768638
EISBN 1728132320; 9781728132327
EndPage 70
ExternalDocumentID 8768638
Genre orig-research
IsPeerReviewed false
IsScholarly false
PageCount 6
PublicationDate 2019-June
PublicationTitle 2019 16th International Conference on Ubiquitous Robots (UR)
PublicationTitleAbbrev URAI
PublicationYear 2019
Publisher IEEE
StartPage 65
SubjectTerms Convolution
Gesture recognition
Human-robot interaction
Tactile sensors
URI https://ieeexplore.ieee.org/document/8768638