Driver’s Head Pose and Gaze Zone Estimation Based on Multi-Zone Templates Registration and Multi-Frame Point Cloud Fusion
Published in | Sensors (Basel, Switzerland) Vol. 22; no. 9; p. 3154
Main Authors | Wang, Yafei; Yuan, Guoliang; Fu, Xianping
Format | Journal Article
Language | English
Published | Switzerland: MDPI AG, 20.04.2022
Abstract | Head pose and eye gaze are vital clues for analysing a driver’s visual attention. Previous approaches achieve promising results from point clouds under constrained conditions. However, these approaches face challenges in complex naturalistic driving scenes. One of the challenges is that point cloud data collected under non-uniform illumination and large head rotation is prone to partial facial occlusion, which causes bad transformations when template matching fails or features are extracted incorrectly. In this paper, a novel estimation method is proposed for predicting accurate driver head pose and gaze zone using an RGB-D camera, with an effective point cloud fusion and registration strategy. In the fusion step, to reduce bad transformations, continuous multi-frame point clouds are registered and fused to generate a stable point cloud. In the registration step, to reduce the reliance on template registration, multiple point clouds in the nearest-neighbor gaze zone are utilized as the template point cloud. A coarse transformation computed by the normal distributions transform is used as the initial transformation and is updated with a particle filter. A gaze zone estimator is trained by combining the head pose and eye image features, in which the head pose is predicted by point cloud registration and the eye image features are extracted via multi-scale sparse coding. Extensive experiments demonstrate that the proposed strategy achieves better results on head pose tracking and also has a low error on gaze zone classification. |
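The fusion and registration steps summarised in the abstract can be illustrated with a short sketch. The snippet below is not the authors' implementation: it uses Open3D, which does not ship a normal distributions transform, so an identity initial guess plus point-to-point ICP stands in for the paper's NDT coarse alignment and particle filter update, and all thresholds (voxel size, correspondence distances) are assumed values.

```python
# Minimal sketch, assuming Open3D; ICP replaces the paper's NDT + particle filter.
import numpy as np
import open3d as o3d


def fuse_frames(frames, voxel_size=0.005):
    """Multi-frame fusion: register each face point cloud to the first frame
    and merge them into one more stable cloud."""
    reference = frames[0]
    fused = o3d.geometry.PointCloud()
    fused += reference
    for frame in frames[1:]:
        icp = o3d.pipelines.registration.registration_icp(
            frame, reference,
            max_correspondence_distance=0.02,   # assumed threshold, in metres
            init=np.eye(4),
            estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
        fused += frame.transform(icp.transformation)  # transform() works in place
    return fused.voxel_down_sample(voxel_size)        # thin out duplicated points


def register_to_template(fused, template, coarse_init=np.eye(4)):
    """Template registration: align the fused cloud to a gaze zone template and
    return the 4x4 transform used as the head pose. `coarse_init` stands in for
    the paper's NDT coarse estimate, which is then refined with a particle filter."""
    result = o3d.pipelines.registration.registration_icp(
        fused, template,
        max_correspondence_distance=0.01,
        init=coarse_init,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```

In the setting described in the abstract, `register_to_template` would be run against the template built from the nearest-neighbor gaze zone, and the returned 4x4 transform gives the head pose.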
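A second sketch outlines the gaze zone estimator described in the abstract: head pose angles from registration are concatenated with multi-scale sparse codes of the eye image and fed to a classifier. The dictionary sizes, the two scales, and the SVM back end are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch, assuming scikit-learn dictionary learning as the sparse coder.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.svm import SVC


def multiscale_eye_features(eye_patches, dictionaries):
    """Encode each grayscale eye patch at full and half resolution and
    concatenate the resulting sparse codes."""
    feats = []
    for patch in eye_patches:
        scales = [patch, patch[::2, ::2]]                 # two assumed scales
        codes = [d.transform(s.reshape(1, -1))[0]
                 for d, s in zip(dictionaries, scales)]
        feats.append(np.concatenate(codes))
    return np.asarray(feats)


def train_gaze_zone_estimator(eye_patches, head_poses, zone_labels):
    """eye_patches: list of equally sized 2-D grayscale arrays;
    head_poses: (N, 3) yaw/pitch/roll from point cloud registration;
    zone_labels: (N,) gaze zone indices."""
    dictionaries = [
        MiniBatchDictionaryLearning(n_components=64, alpha=1.0).fit(
            np.stack([p.reshape(-1) for p in eye_patches])),
        MiniBatchDictionaryLearning(n_components=32, alpha=1.0).fit(
            np.stack([p[::2, ::2].reshape(-1) for p in eye_patches])),
    ]
    features = np.hstack([multiscale_eye_features(eye_patches, dictionaries),
                          np.asarray(head_poses)])
    classifier = SVC(kernel="rbf").fit(features, zone_labels)
    return dictionaries, classifier
```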
Author | Wang, Yafei; Yuan, Guoliang; Fu, Xianping
AuthorAffiliation | School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China; wangyafei@dlmu.edu.cn (Y.W.); fxp@dlmu.edu.cn (X.F.) |
Author_xml | sequence 1: Wang, Yafei; sequence 2: Yuan, Guoliang; sequence 3: Fu, Xianping
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/35590843 (View this record in MEDLINE/PubMed)
ContentType | Journal Article |
Copyright | 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI | 10.3390/s22093154 |
Discipline | Engineering |
EISSN | 1424-8220 |
ExternalDocumentID | oai_doaj_org_article_201cfae820ed4042877f19be949f32c3 PMC9105416 35590843 10_3390_s22093154 |
Genre | Journal Article |
GrantInformation | Research Project of China Disabled Persons' Federation on Assistive Technology (2021CDPFAT-09); Dalian Science and Technology Innovation Fund (2019J11CY001, 2021JJ12GX028); Liaoning Revitalization Talents Program (XLYC1908007)
ISSN | 1424-8220 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 9 |
Keywords | point cloud; head pose; ICP; driving environment; gaze zone
Language | English |
License | https://creativecommons.org/licenses/by/4.0 Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s22093154 |
PMID | 35590843 |
PQID | 2663126172 |
PQPubID | 2032333 |
PublicationCentury | 2000 |
PublicationDate | 20220420 |
PublicationDateYYYYMMDD | 2022-04-20 |
PublicationDecade | 2020 |
PublicationPlace | Switzerland |
PublicationTitle | Sensors (Basel, Switzerland) |
PublicationTitleAlternate | Sensors (Basel) |
PublicationYear | 2022 |
Publisher | MDPI AG MDPI |
StartPage | 3154 |
SubjectTerms | Automobile Driving; Cameras; Deep learning; driving environment; Face; Fixation, Ocular; gaze zone; Head; Head Movements; head pose; ICP; Methods; Optimization algorithms; point cloud; Registration
Title | Driver’s Head Pose and Gaze Zone Estimation Based on Multi-Zone Templates Registration and Multi-Frame Point Cloud Fusion |
URI | https://www.ncbi.nlm.nih.gov/pubmed/35590843 https://www.proquest.com/docview/2663126172 https://www.proquest.com/docview/2667790582 https://pubmed.ncbi.nlm.nih.gov/PMC9105416 https://doaj.org/article/201cfae820ed4042877f19be949f32c3 |
Volume | 22 |