Touch Gesture and Emotion Recognition Using Decomposed Spatiotemporal Convolutions

Bibliographic Details
Published in IEEE Transactions on Instrumentation and Measurement, Vol. 71, pp. 1–9
Main Authors Li, Yun-Kai, Meng, Qing-Hao, Yang, Tian-Hao, Wang, Ya-Xin, Hou, Hui-Rang
Format Journal Article
Language English
Published New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022

Abstract Touch is one of the most essential and effective means of conveying affective feelings and intentions in human communication. For a social robot, the ability to recognize human touch gestures and emotions could help realize efficient and natural human-robot interaction. To this end, an affective touch gesture dataset covering ten kinds of touch gestures and 12 kinds of discrete emotions was built using a pressure sensor array; the acquired touch gesture samples are three-dimensional (3-D) spatiotemporal signals that capture both shape appearance and motion dynamics. Owing to the excellent performance of convolutional neural networks (CNNs), spatiotemporal CNNs have been verified by researchers as effective for 3-D signal classification. However, the large number of parameters and the high cost of training 3-D convolution kernels remain open problems. In this article, a decomposed spatiotemporal convolution was designed for feature representation from the raw touch gesture samples. Specifically, the 3-D kernel was factorized into three 1-D kernels by tensor decomposition. The proposed convolution has a simpler but deeper architecture than standard 3-D convolution, which improves the nonlinear expression ability of the model. In addition, the computation cost is reduced without compromising recognition accuracy. In a user-dependent test mode, the proposed method yields accuracies of up to 92.41% and 72.47% for touch gesture and emotion recognition, respectively. Experimental results demonstrate the effectiveness of the proposed method and, at the same time, preliminarily verify the feasibility of a robot perceiving human emotions through touch.
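A minimal PyTorch sketch of the kernel factorization the abstract describes. This is an illustrative reconstruction, not the authors' implementation: the module name, channel widths, kernel size, padding, and ReLU placement are assumptions; only the idea of splitting one k x k x k kernel into three 1-D kernels (temporal axis, then the two spatial axes) follows the abstract.

```python
# Illustrative sketch only (not the authors' code): one k x k x k 3-D
# convolution replaced by three 1-D convolutions, one per axis of the
# (time, height, width) pressure-map sequence. Channel widths, kernel
# size, and activation choice are assumptions for the example.
import torch
import torch.nn as nn

class DecomposedConv3d(nn.Module):
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        p = k // 2  # "same" padding for odd k
        # 1-D convolution along the temporal axis.
        self.conv_t = nn.Conv3d(in_ch, out_ch, (k, 1, 1), padding=(p, 0, 0))
        # 1-D convolutions along the two spatial axes, one at a time.
        self.conv_h = nn.Conv3d(out_ch, out_ch, (1, k, 1), padding=(0, p, 0))
        self.conv_w = nn.Conv3d(out_ch, out_ch, (1, 1, k), padding=(0, 0, p))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # x: (batch, channels, frames, rows, cols)
        x = self.act(self.conv_t(x))
        x = self.act(self.conv_h(x))
        return self.act(self.conv_w(x))

# Toy input shaped like a touch-gesture clip from a pressure sensor array
# (sizes made up for illustration): 2 samples, 16 frames, 8 x 8 taxels.
x = torch.randn(2, 1, 16, 8, 8)
y = DecomposedConv3d(in_ch=1, out_ch=32)(x)
print(y.shape)  # torch.Size([2, 32, 16, 8, 8])
```

For kernel size k and comparable channel widths, a standard 3-D layer stores on the order of C_in * C_out * k^3 weights versus roughly 3 * C_in * C_out * k for the three 1-D stages, and the activations between stages add the extra depth and nonlinearity the abstract mentions.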
Authors
– Li, Yun-Kai (ORCID: 0000-0001-8152-1403; email: yunkai_li1995@tju.edu.cn)
– Meng, Qing-Hao (ORCID: 0000-0002-9915-7088; email: qh_meng@tju.edu.cn)
– Yang, Tian-Hao (ORCID: 0000-0003-2640-0531; email: 3015203055@tju.edu.cn)
– Wang, Ya-Xin (ORCID: 0000-0002-8735-2994; email: wangyaxin714@163.com)
– Hou, Hui-Rang (ORCID: 0000-0002-4608-273X; email: houhuirang@tju.edu.cn)
Affiliation (all authors): Tianjin Key Laboratory of Process Measurement and Control, Institute of Robotics and Autonomous Systems, School of Electrical and Information Engineering, Tianjin University, Tianjin, China
CODEN IEIMAO
CitedBy (Crossref DOIs) 10.3389/fnins.2023.1216181
10.1109/JPROC.2023.3272780
10.1088/2631-8695/acc515
10.1109/TIM.2024.3373045
10.1109/TII.2022.3174063
10.1109/JSEN.2022.3187776
10.1109/TFUZZ.2024.3373125
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DOI 10.1109/TIM.2022.3147338
DatabaseName IEEE All-Society Periodicals Package (ASPP) 1998–Present; IEEE Xplore; CrossRef; Electronics & Communications Abstracts; Solid State and Superconductivity Abstracts; Technology Research Database; Advanced Technologies Database with Aerospace
Discipline Engineering
Physics
EISSN 1557-9662
Genre Original research
GrantInformation
– China Postdoctoral Science Foundation, grant 2021M692390 (funder ID: 10.13039/501100002858)
– Tianjin Natural Science Foundation, grants 20JCZDJC00150 and 20JCYBJC00320 (funder ID: 10.13039/501100006606)
ISSN 0018-9456
IsPeerReviewed true
IsScholarly true
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
PublicationTitleAbbrev TIM
SubjectTerms Accuracy
Artificial neural networks
Decomposed spatiotemporal convolution
Decomposition
Emotion recognition
Emotions
Human communication
Human performance
human–robot tactile interaction
Kernels
Pressure sensors
Robots
Sensor arrays
Service robots
Signal classification
Spatiotemporal phenomena
Tensors
Three-dimensional displays
touch gesture recognition
URI https://ieeexplore.ieee.org/document/9706186
https://www.proquest.com/docview/2633047165