A Method for Measuring Contact Points in Human–Object Interaction Utilizing Infrared Cameras
Published in | Frontiers in robotics and AI Vol. 8; p. 800131 |
---|---|
Main Authors | Hakala, Jussi; Häkkinen, Jukka |
Format | Journal Article |
Language | English |
Published | Switzerland: Frontiers Media S.A., 14.02.2022 |
Subjects | contact point; grasping; infrared camera; prehension movements; Robotics and AI; touch |
Online Access | Get full text |
Abstract | This article presents a novel method for measuring contact points in human–object interaction. Research in multiple prehension-related fields, e.g., action planning, affordance, motor function, ergonomics, and robotic grasping, benefits from accurate and precise measurements of contact points between a subject’s hands and objects. During interaction, the subject’s hands occlude the contact points, which poses a major challenge for direct optical measurement methods. Our method solves the occlusion problem by exploiting thermal energy transfer from the subject’s hand to the object surface during interaction. After the interaction, we measure the heat emitted by the object surface with four high-resolution infrared cameras surrounding the object. A computer-vision algorithm detects the areas in the infrared images where the subject’s fingers have touched the object. A structured light 3D scanner produces a point cloud of the scene, which enables the localization of the object in relation to the infrared cameras. We then use the localization result to project the detected contact points from the infrared camera images to the surface of the 3D model of the object. Data collection with this method is fast, unobtrusive, contactless, markerless, and automated. The method enables accurate measurement of contact points in non-trivially complex objects. Furthermore, the method is extendable to measuring surface contact areas, or patches, instead of contact points. In this article, we present the method and sample grasp measurement results with publicly available objects. |
---|---|
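The contact-detection step described in the abstract (finding the areas in the infrared images that the subject's fingers have warmed) can be illustrated with a minimal sketch. This is not the authors' implementation: the 2 K temperature margin, the minimum blob size, and the OpenCV-based blob extraction are assumptions made purely for illustration.

```python
# Hypothetical sketch: threshold a radiometric infrared frame against the ambient
# surface temperature and return the pixel centroids of the warm blobs left by
# the fingertips. The margin and minimum area are illustrative, not from the article.
import cv2
import numpy as np

def detect_contact_points(ir_frame_kelvin: np.ndarray,
                          ambient_kelvin: float,
                          margin_kelvin: float = 2.0,
                          min_area_px: int = 20):
    """Return (u, v) centroids of regions warmer than ambient + margin."""
    # Binary mask of pixels noticeably warmer than the surrounding surface.
    warm = (ir_frame_kelvin > ambient_kelvin + margin_kelvin).astype(np.uint8) * 255

    # Suppress single-pixel noise before extracting connected regions.
    warm = cv2.morphologyEx(warm, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    contours, _ = cv2.findContours(warm, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area_px:
            continue  # ignore small warm specks
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```

In practice the threshold would depend on the object's emissivity, the ambient temperature, and the delay between releasing the object and capturing the infrared images.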
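The projection step (mapping a detection from an infrared image onto the surface of the object's 3D model, using the pose recovered from the structured-light point cloud) can be sketched as a ray-to-point-cloud lookup under a pinhole camera model. The function name, the calibration inputs, and the nearest-point heuristic are illustrative assumptions, not details taken from the article.

```python
# Hypothetical sketch: back-project a detected (u, v) pixel through a calibrated
# pinhole camera and pick the point-cloud point closest to the viewing ray.
import numpy as np

def pixel_to_surface_point(u: float, v: float,
                           K: np.ndarray,                 # 3x3 camera intrinsics
                           R_cam_to_world: np.ndarray,    # 3x3 camera-to-world rotation
                           cam_center_world: np.ndarray,  # camera centre in world coords
                           cloud: np.ndarray):            # Nx3 object point cloud (world)
    """Return the cloud point nearest to the viewing ray through pixel (u, v)."""
    # Ray direction in camera coordinates, rotated into world coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_world = R_cam_to_world @ ray_cam
    ray_world /= np.linalg.norm(ray_world)

    rel = cloud - cam_center_world       # vectors from camera centre to each point
    along = rel @ ray_world              # signed distance along the ray
    # Perpendicular distance from each cloud point to the viewing ray.
    perp = np.linalg.norm(rel - np.outer(along, ray_world), axis=1)
    perp[along < 0] = np.inf             # ignore points behind the camera
    return cloud[int(np.argmin(perp))]
```

A full pipeline would repeat this lookup for each centroid from each of the four infrared cameras and merge the resulting surface points on the object model.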
Author | Hakala, Jussi; Häkkinen, Jukka
AuthorAffiliation | Department of Psychology and Logopedics, Faculty of Medicine, University of Helsinki, Helsinki, Finland
ContentType | Journal Article |
Copyright | Copyright © 2022 Hakala and Häkkinen.
DOI | 10.3389/frobt.2021.800131 |
DatabaseName | CrossRef; PubMed; MEDLINE - Academic; PubMed Central (Full Participant titles); DOAJ Directory of Open Access Journals
DatabaseTitle | CrossRef; PubMed; MEDLINE - Academic
Discipline | Engineering |
DocumentTitleAlternate | Hakala and Häkkinen |
EISSN | 2296-9144 |
ExternalDocumentID | oai_doaj_org_article_aa70c7863eb9461d994b86905637cc31 PMC8883210 35237668 10_3389_frobt_2021_800131 |
Genre | Journal Article; Review
GrantInformation | Grant ID: 4331/31/2016
ISSN | 2296-9144 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | grasping; touch; infrared camera; contact point; prehension movements
Language | English |
License | Copyright © 2022 Hakala and Häkkinen. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
Notes | Edited by: Antonios Gasteratos, Democritus University of Thrace, Greece. Reviewed by: Kai Wang, China United Network Communications Group, China; Alexander Rassau, Edith Cowan University, Australia; Ruediger Dillmann, Karlsruhe Institute of Technology (KIT), Germany. This article was submitted to Robot and Machine Vision, a section of the journal Frontiers in Robotics and AI.
OpenAccessLink | https://doaj.org/article/aa70c7863eb9461d994b86905637cc31 |
PMID | 35237668 |
PublicationCentury | 2000 |
PublicationDate | 2022-02-14 |
PublicationDecade | 2020 |
PublicationPlace | Switzerland |
PublicationTitle | Frontiers in robotics and AI |
PublicationTitleAlternate | Front Robot AI |
PublicationYear | 2022 |
Publisher | Frontiers Media S.A |
SecondaryResourceType | review_article |
SourceID | doaj; pubmedcentral; proquest; pubmed; crossref
SourceType | Open Website; Open Access Repository; Aggregation Database; Index Database
StartPage | 800131 |
SubjectTerms | contact point; grasping; infrared camera; prehension movements; Robotics and AI; touch
Title | A Method for Measuring Contact Points in Human–Object Interaction Utilizing Infrared Cameras |
URI | https://www.ncbi.nlm.nih.gov/pubmed/35237668 https://www.proquest.com/docview/2635476174 https://pubmed.ncbi.nlm.nih.gov/PMC8883210 https://doaj.org/article/aa70c7863eb9461d994b86905637cc31 |
Volume | 8 |