Improving Haptic Response for Contextual Human Robot Interaction

Bibliographic Details
Published in Sensors (Basel, Switzerland), Vol. 22, No. 5, p. 2040
Main Authors Mugisha, Stanley; Guda, Vamsi Krishna; Chevallereau, Christine; Zoppi, Matteo; Molfino, Rezia; Chablat, Damien
Format Journal Article
Language English
Published Basel, Switzerland: MDPI AG, 05.03.2022
ISSN 1424-8220
EISSN 1424-8220
DOI 10.3390/s22052040

Abstract For haptic interaction, a user in a virtual environment needs to interact with proxies attached to a robot. The device must be at the exact location defined in the virtual environment at the right time. However, due to device limitations, delays are unavoidable. One solution to improve the device response is to infer the human's intended motion and move the robot toward the desired goal at the earliest possible time. This paper presents an experimental study to improve the prediction time and reduce the time the robot takes to reach the desired position. We developed motion strategies based on hand motion and eye-gaze direction to determine the point of user interaction in a virtual environment. To assess the performance of the strategies, we conducted a subject-based experiment using an exergame for reach-and-grab tasks designed for upper-limb rehabilitation training. The experimental results revealed that eye-gaze-based prediction significantly improved the detection time by 37% and the time the robot takes to reach the target by 27%. Further analysis provided more insight into the effect of the eye-gaze window and the hand threshold on the device response for the experimental task.
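To make the idea in the abstract concrete: the prediction combines a window of recent eye-gaze samples with a hand-motion threshold, so the robot can be commanded toward the predicted proxy before the hand arrives. The following Python sketch is illustrative only, not the paper's implementation; the IntentPredictor class, the window length, the speed threshold, and the 0.6 dwell fraction are all hypothetical choices.

import math
from collections import Counter, deque

class IntentPredictor:
    """Predict the user's intended target from gaze samples and hand speed."""

    def __init__(self, gaze_window_size=30, hand_speed_threshold=0.05):
        # Sliding window of recent gaze hits (target ids); size is assumed.
        self.gaze_window = deque(maxlen=gaze_window_size)
        # Minimum hand displacement per tracking frame (metres); assumed value.
        self.hand_speed_threshold = hand_speed_threshold

    def update(self, gaze_target, hand_pos, prev_hand_pos):
        """gaze_target: id of the virtual object the gaze ray hits, or None.
        hand_pos, prev_hand_pos: (x, y, z) hand positions in metres."""
        if gaze_target is not None:
            self.gaze_window.append(gaze_target)
        # Hand threshold: only predict once the hand has started moving.
        if math.dist(hand_pos, prev_hand_pos) < self.hand_speed_threshold:
            return None
        if not self.gaze_window:
            return None
        # Majority vote over the gaze window selects the candidate target.
        target, votes = Counter(self.gaze_window).most_common(1)[0]
        # Require the gaze to have dwelt on one target for most of the window.
        if votes / self.gaze_window.maxlen > 0.6:
            return target  # send the robot toward this proxy location early
        return None

In use, update() would run once per tracking frame; the first non-None return marks the earliest moment the robot can start moving toward the predicted proxy, which is what shortens the detection and travel times reported in the abstract.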
Author Mugisha, Stanley (ORCID 0000-0002-0046-6850)
Guda, Vamsi Krishna
Chevallereau, Christine
Zoppi, Matteo
Molfino, Rezia
Chablat, Damien (ORCID 0000-0001-7847-6162)
AuthorAffiliation 1 Dipartimento di Ingegneria Meccanica, Energetica, Gestionale e dei Trasporti, University of Genova, Via All’Opera Pia 15, 16145 Genova, Italy; matteo.zoppi@unige.it (M.Z.); rezia.molfino@unige.it (R.M.)
2 CNRS, LS2N, UMR 6004, 1 Rue de la Noë, 44321 Nantes, France; vamsikrishna.guda@ls2n.fr (V.K.G.); christine.chevallereau@ls2n.fr (C.C.); damien.chablat@cnrs.fr (D.C.)
BackLink https://www.ncbi.nlm.nih.gov/pubmed/35271188 (view this record in MEDLINE/PubMed)
https://hal.science/hal-03599147 (view this record in HAL)
CitedBy 10.1007/s10845-024-02362-x
10.1016/j.birob.2023.100131
10.1109/ACCESS.2024.3400604
10.3934/era.2023121
10.1155/ahci/8685903
10.3390/s22207729
Copyright 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Distributed under a Creative Commons Attribution 4.0 International License
Discipline Engineering
Computer Science
ExternalDocumentID PMC8914947; hal-03599147; PMID 35271188
GrantInformation LobbyBot project, grant ANR-17-CE33; Regione Liguria, grant ARGE17-992/10/1
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 5
Keywords haptic devices; virtual reality; response time; human–robot interaction; eye–gaze tracking; eye-gaze prediction
License Creative Commons Attribution 4.0 International (CC BY): https://creativecommons.org/licenses/by/4.0
ORCID 0000-0001-7847-6162
0000-0002-0046-6850
0000-0002-1929-5211
OpenAccessLink http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s22052040
PMID 35271188
PublicationDate 2022-03-05
PublicationPlace Basel, Switzerland
PublicationTitle Sensors (Basel, Switzerland)
PublicationTitleAlternate Sensors (Basel)
PublicationYear 2022
Publisher MDPI AG
StartPage 2040
SubjectTerms Computer Science
eye–gaze tracking
Hand - physiology
haptic devices
Haptic Technology
Haptics
Humans
human–robot interaction
Motivation
Neural networks
Principal components analysis
response time
Robotics
Robotics - methods
Robots
Upper Extremity
Virtual reality
URI https://www.ncbi.nlm.nih.gov/pubmed/35271188
https://www.proquest.com/docview/2637791174
https://www.proquest.com/docview/2638713400
https://hal.science/hal-03599147
https://pubmed.ncbi.nlm.nih.gov/PMC8914947
https://doaj.org/article/90f6eea31c054d20b12f49874a9ebde7
Volume 22