Augmenting collaborative interaction with shared visualization of eye movement and gesture in VR
Published in | Computer animation and virtual worlds Vol. 35; no. 3 |
---|---|
Main Authors | Liu, Yang; Zhao, Song; Cheng, Shiwei |
Format | Journal Article |
Language | English |
Published | Chichester: Wiley Subscription Services, Inc, 01.05.2024 |
Subjects | Collaboration; collaborative interaction; eye movement; Eye movements; Industrial applications; multimodal interaction; System effectiveness; Verbal communication; Virtual reality; Visualization |
Abstract | Virtual Reality (VR)‐enabled multi‐user collaboration has gradually been adopted in academic research and industrial applications, but key problems remain. First, users often find it difficult to select or manipulate objects in complex three‐dimensional spaces, which greatly reduces their operational efficiency. Second, supporting natural communication cues is crucial for cooperation in VR, especially in collaborative tasks, where ambiguous verbal communication alone cannot effectively convey to partners which objects to select or manipulate. To address these issues, we propose a new interaction method, Eye‐Gesture Combination Interaction in VR, which enhances the execution of collaborative tasks by sharing visualizations of eye movement and gesture data among partners. Our user experiments showed that representing eye gaze as dots and gestures as virtual hands helps users complete tasks faster than other visualization methods. Finally, we developed a VR multi‐user collaborative assembly system; a user study showed that sharing gaze points and gestures among users significantly improves their productivity. Our work improves the efficiency of multi‐user collaborative systems in VR and provides new design guidelines for such systems.
We propose a collaboration method for sharing gaze points and gestures in VR, which can effectively increase the collaboration efficiency between VR users. |
---|---|
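To make the shared-cue idea in the abstract concrete, the sketch below shows one plausible way a VR client could package a user's gaze point and hand pose for broadcast to collaborators, who would then render the remote gaze as a dot and the remote hand as a virtual hand. This is a minimal illustration under stated assumptions: the message fields, class names, and encode/decode helpers are invented for this sketch and are not the authors' implementation or networking protocol.

```python
# Minimal sketch (not the authors' implementation): one plausible wire format
# for sharing a user's gaze point and hand pose with collaborators in a
# multi-user VR session. All field names and the JSON encoding are assumptions.
import json
import time
from dataclasses import dataclass, asdict
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class GazeSample:
    position: Vec3   # gaze hit point in world coordinates
    target_id: str   # id of the object the gaze ray hit, "" if none

@dataclass
class HandPose:
    wrist_position: Vec3
    wrist_rotation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)
    pinch_strength: float                               # 0.0 (open) .. 1.0 (full pinch)

@dataclass
class SharedCueMessage:
    user_id: str
    timestamp: float
    gaze: GazeSample
    right_hand: HandPose

def encode(msg: SharedCueMessage) -> bytes:
    """Serialize a cue message for sending over the collaboration channel."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode(payload: bytes) -> dict:
    """Decode a received cue message; a real client would map this back onto
    the remote user's gaze dot and virtual hand."""
    return json.loads(payload.decode("utf-8"))

if __name__ == "__main__":
    msg = SharedCueMessage(
        user_id="user-A",
        timestamp=time.time(),
        gaze=GazeSample(position=(0.4, 1.2, 2.0), target_id="bolt_03"),
        right_hand=HandPose(
            wrist_position=(0.3, 1.0, 1.8),
            wrist_rotation=(0.0, 0.0, 0.0, 1.0),
            pinch_strength=0.8,
        ),
    )
    wire = encode(msg)
    print(decode(wire)["gaze"]["target_id"])  # -> bolt_03
```

A real collaborative assembly system would presumably stream such messages at the headset's tracking rate over its existing multi-user synchronization channel rather than hand-rolled JSON.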
Author | Cheng, Shiwei Zhao, Song Liu, Yang |
Author_xml | – sequence: 1 givenname: Yang orcidid: 0009-0005-0760-9382 surname: Liu fullname: Liu, Yang organization: Zhejiang University of Technology – sequence: 2 givenname: Song surname: Zhao fullname: Zhao, Song organization: Zhejiang University of Technology – sequence: 3 givenname: Shiwei orcidid: 0000-0003-4716-4179 surname: Cheng fullname: Cheng, Shiwei email: 249401866@qq.com organization: Zhejiang University of Technology |
CitedBy_id | crossref_primary_10_1016_j_ijhcs_2025_103458 |
Cites_doi | 10.1145/253284.253315 10.1145/2993369.2993396 10.1145/351006.351018 10.1037/1082-989X.3.1.55 10.1080/1357650X.2013.783045 10.1145/3317959.3321489 10.1145/3313831.3376550 10.1145/3411764.3445759 10.1145/3290605.3300727 10.1145/3448018.3457998 10.1007/s10055-023-00748-5 10.1145/2168556.2168568 10.1145/3617368 10.3390/app11125754 10.1109/THMS.2018.2791562 10.1145/3025453.3025573 10.1007/s00170-022-08747-7 10.1016/j.rcim.2020.102071 10.1145/3131277.3132180 10.1007/s00170-018-03237-1 10.1109/VR51125.2022.00044 10.1145/2808435.2808439 10.1145/1460563.1460593 10.1145/3317959.3321492 |
ContentType | Journal Article |
Copyright | 2024 John Wiley & Sons Ltd. 2024 John Wiley & Sons, Ltd. |
Copyright_xml | – notice: 2024 John Wiley & Sons Ltd. – notice: 2024 John Wiley & Sons, Ltd. |
DBID | AAYXX CITATION 7SC 8FD JQ2 L7M L~C L~D |
DOI | 10.1002/cav.2264 |
DatabaseName | CrossRef Computer and Information Systems Abstracts Technology Research Database ProQuest Computer Science Collection Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional |
DatabaseTitle | CrossRef Computer and Information Systems Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Advanced Technologies Database with Aerospace ProQuest Computer Science Collection Computer and Information Systems Abstracts Professional |
DatabaseTitleList | CrossRef Computer and Information Systems Abstracts |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Visual Arts |
EISSN | 1546-427X |
EndPage | n/a |
ExternalDocumentID | 10_1002_cav_2264 CAV2264 |
Genre | article |
GrantInformation_xml | – fundername: National Natural Science Foundation of China funderid: 62172368; 61772468 – fundername: Zhejiang Provincial Key Research and Development Program funderid: 2023C01045 – fundername: Zhejiang Provincial Natural Science Foundation of China funderid: LR22F020003 |
ISSN | 1546-4261 |
IngestDate | Sat Jul 26 03:40:53 EDT 2025 Tue Jul 01 02:42:24 EDT 2025 Thu Apr 24 22:59:38 EDT 2025 Wed Aug 20 07:26:33 EDT 2025 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
Language | English |
LinkModel | DirectLink |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 |
ORCID | 0000-0003-4716-4179 0009-0005-0760-9382 |
PQID | 3071608552 |
PQPubID | 2034909 |
PageCount | 14 |
ParticipantIDs | proquest_journals_3071608552 crossref_citationtrail_10_1002_cav_2264 crossref_primary_10_1002_cav_2264 wiley_primary_10_1002_cav_2264_CAV2264 |
PublicationCentury | 2000 |
PublicationDate | May/June 2024 2024-05-00 20240501 |
PublicationDateYYYYMMDD | 2024-05-01 |
PublicationDate_xml | – month: 05 year: 2024 text: May/June 2024 |
PublicationDecade | 2020 |
PublicationPlace | Chichester |
PublicationPlace_xml | – name: Chichester |
PublicationTitle | Computer animation and virtual worlds |
PublicationYear | 2024 |
Publisher | Wiley Subscription Services, Inc |
Publisher_xml | – name: Wiley Subscription Services, Inc |
SSID | ssj0026210 |
SourceID | proquest crossref wiley |
SourceType | Aggregation Database Enrichment Source Index Database Publisher |
SubjectTerms | Collaboration collaborative interaction eye movement Eye movements Industrial applications multimodal interaction System effectiveness Verbal communication Virtual reality Visualization |
Title | Augmenting collaborative interaction with shared visualization of eye movement and gesture in VR |
URI | https://onlinelibrary.wiley.com/doi/abs/10.1002%2Fcav.2264 https://www.proquest.com/docview/3071608552 |
Volume | 35 |