Online Robot Teaching With Natural Human-Robot Interaction

Bibliographic Details
Published in: IEEE Transactions on Industrial Electronics (1982), Vol. 65, No. 12, pp. 9571-9581
Main Authors: Du, Guanglong; Chen, Mingxuan; Liu, Caibing; Zhang, Bo; Zhang, Ping
Format: Journal Article
Language: English
Published: New York, IEEE, 01.12.2018
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Abstract With the development of Industry 4.0, robots tend to be intelligent and collaborative. For one, robots can interact naturally with humans. For another, robots can work collaboratively with humans in a common area. The traditional teaching method is no longer suitable for the production mode with human-robot collaboration, since traditional teaching processes are complicated and require highly skilled staff. This paper focuses on a natural way of online teaching, which can be applied to tasks such as welding, painting, and stamping. It presents an online teaching method based on the fusion of speech and gesture. A depth camera (Kinect) and an inertial measurement unit are used to capture the speech and gesture of the human. An interval Kalman filter and an improved particle filter are employed to estimate the gesture. To integrate speech and gesture information more deeply, a novel method of text-based audio-visual fusion is proposed, which extracts the most useful information from the speech and gestures by transforming them into text. Finally, a maximum entropy algorithm translates the fused text into the corresponding robot instructions. The practicality and effectiveness of the proposed approach were validated by five subjects without robot teaching skills. The results indicate that the online robot teaching system can successfully teach robot manipulators.
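The gesture-estimation step described in the abstract smooths noisy sensor readings with Kalman-style filtering. As a minimal illustrative sketch only — the paper itself uses an interval Kalman filter and an improved particle filter, not this simplified scalar version — a plain 1-D Kalman filter over noisy orientation readings looks like this (all names and numbers below are assumptions for illustration):

```python
# Illustrative sketch: a scalar Kalman filter smoothing noisy 1-D readings,
# in the spirit of the gesture-estimation step the abstract describes.
# This is NOT the authors' interval Kalman / particle filter method.

def kalman_smooth(measurements, process_var=1e-3, meas_var=0.05):
    """Return filtered estimates for a sequence of noisy scalar readings."""
    estimate = measurements[0]   # initial state guess (first reading)
    error = 1.0                  # initial estimate covariance
    out = []
    for z in measurements:
        error += process_var                 # predict: uncertainty grows
        gain = error / (error + meas_var)    # Kalman gain in [0, 1]
        estimate += gain * (z - estimate)    # correct with measurement z
        error *= (1.0 - gain)                # update covariance
        out.append(estimate)
    return out

readings = [1.0, 1.2, 0.9, 1.1, 1.05]       # hypothetical IMU angles (rad)
print(kalman_smooth(readings))
```

Each output is a convex combination of the previous estimate and the new reading, so the filtered sequence stays within the range of the measurements while damping jitter.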
Author_xml – sequence: 1
  givenname: Guanglong
  orcidid: 0000-0001-9425-843X
  surname: Du
  fullname: Du, Guanglong
  email: csgldu@scut.edu.cn
  organization: School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
– sequence: 2
  givenname: Mingxuan
  surname: Chen
  fullname: Chen, Mingxuan
  email: 317460580@qq.com
  organization: School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
– sequence: 3
  givenname: Caibing
  surname: Liu
  fullname: Liu, Caibing
  email: 1044083971@qq.com
  organization: School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
– sequence: 4
  givenname: Bo
  surname: Zhang
  fullname: Zhang, Bo
  email: 550510024@qq.com
  organization: School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
– sequence: 5
  givenname: Ping
  surname: Zhang
  fullname: Zhang, Ping
  email: pzhang@scut.edu.cn
  organization: School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
CODEN ITIED6
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
DOI 10.1109/TIE.2018.2823667
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Electronics & Communications Abstracts
Technology Research Database
Advanced Technologies Database with Aerospace
Discipline Engineering
Education
EISSN 1557-9948
EndPage 9581
ExternalDocumentID 10_1109_TIE_2018_2823667
8331855
Genre orig-research
GrantInformation_xml – fundername: Guangdong Special Projects
  grantid: 2016TQ03X824
– fundername: National Natural Science Foundation of China
  grantid: 61602182
– fundername: Pearl River S&T Nova Program of Guangzhou
  grantid: 201710010059
– fundername: Guangdong Natural Science Funds for Distinguished Young Scholar
  grantid: 2017A030306015
– fundername: Fundamental Research Funds for the Central Universities
  grantid: 2017JQ009
ISSN 0278-0046
IsPeerReviewed true
IsScholarly true
Issue 12
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
ORCID 0000-0001-9425-843X
PQID 2081957197
PQPubID 85464
PageCount 11
PublicationDate 2018-12-01
PublicationPlace New York
PublicationTitle IEEE transactions on industrial electronics (1982)
PublicationTitleAbbrev TIE
PublicationYear 2018
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 9571
SubjectTerms Audio data
CAI
Collaboration
Computer assisted instruction
Distance learning
Education
Gesture
human–robot interaction
Industrial development
Kalman filters
Maximum entropy
natural
natural speech understanding
On-line systems
online robot teaching
Robot arms
Robot kinematics
Robot sensing systems
Robots
Service robots
Speech
Task analysis
Teaching methods
Title Online Robot Teaching With Natural Human-Robot Interaction
URI https://ieeexplore.ieee.org/document/8331855
https://www.proquest.com/docview/2081957197
Volume 65