Detection of eye contact with deep neural networks is as accurate as human experts
Published in | Nature Communications, Vol. 11, no. 1, Article 6386 (10 pages) |
---|---|
Main Authors | Chong, Eunji; Clark-Whitney, Elysha; Southerland, Audrey; Stubbs, Elizabeth; Miller, Chanel; Ajodan, Eliana L.; Silverman, Melanie R.; Lord, Catherine; Rozga, Agata; Jones, Rebecca M.; Rehg, James M. |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK, 14.12.2020 |
Subjects | Eye contact; Deep learning; Neural networks; Autism Spectrum Disorder; Social behavior |
Online Access | Get full text |
Abstract | Eye contact is among the primary means of social communication used by humans. Quantification of eye contact is valuable as part of the analysis of social roles and communication skills, and for clinical screening. Estimating a subject’s looking direction is a challenging task, but eye contact can be effectively captured by a wearable point-of-view camera, which provides a unique viewpoint. While moments of eye contact from this viewpoint can be hand-coded, such a process tends to be laborious and subjective. In this work, we develop a deep neural network model to automatically detect eye contact in egocentric video. It is the first to achieve accuracy equivalent to that of human experts. We train a deep convolutional network on a dataset of 4,339,879 annotated images from 103 subjects with diverse demographic backgrounds, 57 of whom have a diagnosis of Autism Spectrum Disorder. The network achieves an overall precision of 0.936 and recall of 0.943 on 18 validation subjects, and its performance is on par with that of 10 trained human coders, who have a mean precision of 0.918 and recall of 0.946. Our method will be instrumental in gaze behavior analysis by serving as a scalable, objective, and accessible tool for clinicians and researchers.
Eye contact is a key social behavior, and its measurement could facilitate the diagnosis and treatment of autism. Here the authors show that a deep neural network model can detect eye contact as accurately as human experts. |
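The abstract describes a frame-level classifier: a deep convolutional network is trained on annotated egocentric images to decide, for each frame, whether the child is making eye contact with the camera wearer. As a rough illustration only, not the authors' released code, the sketch below fine-tunes an ImageNet-pretrained ResNet-50 for this binary per-frame task; the backbone choice, the 224x224 face-crop input (assuming crops from a face detector), and the training settings are all assumptions.

```python
# Hedged sketch of a per-frame eye-contact classifier (assumed design, not the paper's code).
import torch
import torch.nn as nn
from torchvision import models

class EyeContactClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # ImageNet-pretrained backbone; swap the 1000-way head for a single logit.
        self.backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, face_crops):          # face_crops: (N, 3, 224, 224) tensor
        return self.backbone(face_crops)    # (N, 1) raw logits

def train_step(model, optimizer, face_crops, labels):
    """One optimization step on a batch of face crops and 0/1 eye-contact labels."""
    model.train()
    optimizer.zero_grad()
    logits = model(face_crops).squeeze(1)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, labels.float())
    loss.backward()
    optimizer.step()
    return loss.item()
```

At inference time, the sigmoid of the logit can be thresholded (e.g., at 0.5) to yield the per-frame eye-contact decisions that are then compared against human annotations.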
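The reported comparison with the 10 trained human coders is expressed as frame-level precision and recall, and the paper's references also cite Cohen's kappa as an agreement statistic. A minimal sketch of such an evaluation, assuming binary per-frame labels from the model and from an expert coder and using scikit-learn's metrics (the tooling, function names, and data below are illustrative assumptions, not the paper's pipeline), could look like this:

```python
# Hedged sketch: agreement between model output and an expert coder on per-frame labels.
from sklearn.metrics import precision_score, recall_score, cohen_kappa_score

def compare_to_expert(model_labels, expert_labels):
    """Both arguments are sequences of 0/1 per-frame eye-contact decisions."""
    return {
        "precision": precision_score(expert_labels, model_labels),
        "recall": recall_score(expert_labels, model_labels),
        "kappa": cohen_kappa_score(expert_labels, model_labels),
    }

# Illustrative usage with made-up labels:
print(compare_to_expert([1, 0, 1, 1, 0, 1], [1, 0, 1, 0, 0, 1]))
```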
ArticleNumber | 6386 |
Author | Southerland, Audrey; Rozga, Agata; Clark-Whitney, Elysha; Rehg, James M.; Ajodan, Eliana L.; Chong, Eunji; Lord, Catherine; Jones, Rebecca M.; Miller, Chanel; Stubbs, Elizabeth; Silverman, Melanie R. |
Author_xml | – 1. Chong, Eunji (ORCID 0000-0002-5235-202X; eunjichong@gatech.edu), School of Interactive Computing, Georgia Institute of Technology
– 2. Clark-Whitney, Elysha, Center for Autism and the Developing Brain, Weill Cornell Medicine
– 3. Southerland, Audrey, School of Interactive Computing, Georgia Institute of Technology
– 4. Stubbs, Elizabeth, School of Interactive Computing, Georgia Institute of Technology
– 5. Miller, Chanel, School of Interactive Computing, Georgia Institute of Technology
– 6. Ajodan, Eliana L., Center for Autism and the Developing Brain, Weill Cornell Medicine
– 7. Silverman, Melanie R., Center for Autism and the Developing Brain, Weill Cornell Medicine
– 8. Lord, Catherine, School of Medicine, University of California
– 9. Rozga, Agata (ORCID 0000-0002-5558-9786), School of Interactive Computing, Georgia Institute of Technology
– 10. Jones, Rebecca M., Center for Autism and the Developing Brain, Weill Cornell Medicine
– 11. Rehg, James M., School of Interactive Computing, Georgia Institute of Technology |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/33318484 (view this record in MEDLINE/PubMed) |
CitedBy_id | crossref_primary_10_2196_33771 crossref_primary_10_3389_frobt_2022_770165 crossref_primary_10_1016_j_engappai_2022_105743 crossref_primary_10_1007_s11263_024_02095_7 crossref_primary_10_2196_35406 crossref_primary_10_1109_TBME_2022_3223736 crossref_primary_10_1002_aur_3087 crossref_primary_10_1007_s10803_023_05990_z crossref_primary_10_3390_s23249619 crossref_primary_10_1007_s11548_023_02901_6 crossref_primary_10_1038_s41467_023_38901_y crossref_primary_10_3390_a17120560 crossref_primary_10_1038_s41398_025_03233_6 crossref_primary_10_3389_frobt_2021_650906 |
ContentType | Journal Article |
Copyright | The Author(s) 2020 The Author(s) 2020. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
DOI | 10.1038/s41467-020-19712-x |
Discipline | Biology |
EISSN | 2041-1723 |
EndPage | 10 |
ExternalDocumentID | oai_doaj_org_article_f237553c732548f58bf57fdc2d279307 PMC7736573 33318484 10_1038_s41467_020_19712_x |
Genre | Research Support, Non-U.S. Gov't Journal Article Research Support, N.I.H., Extramural |
GrantInformation_xml | – fundername: U.S. Department of Health & Human Services | National Institutes of Health (NIH); grantid: R01 MH114999; funderid: https://doi.org/10.13039/100000002 – fundername: Simons Foundation; grantid: 336363; 383667; funderid: https://doi.org/10.13039/100000893 – fundername: NIMH NIH HHS; grantid: R01 MH114999 |
ISSN | 2041-1723 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Language | English |
License | Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
ORCID | 0000-0002-5235-202X 0000-0002-5558-9786 |
OpenAccessLink | https://doaj.org/article/f237553c732548f58bf57fdc2d279307 |
PMID | 33318484 |
PQID | 2473191642 |
PQPubID | 546298 |
PageCount | 10 |
PublicationCentury | 2000 |
PublicationDate | 2020-12-14 |
PublicationDateYYYYMMDD | 2020-12-14 |
PublicationDecade | 2020 |
PublicationPlace | London |
PublicationPlace_xml | – name: London – name: England |
PublicationTitle | Nature communications |
PublicationTitleAbbrev | Nat Commun |
PublicationTitleAlternate | Nat Commun |
PublicationYear | 2020 |
Publisher | Nature Publishing Group UK Nature Publishing Group Nature Portfolio |
SourceID | doaj pubmedcentral proquest pubmed crossref springer |
SourceType | Open Website Open Access Repository Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 6386 |
SubjectTerms | 639/705/117; 706/689/477; Artificial neural networks; Autism; Autism Spectrum Disorder; Child, Preschool; Coders; Communication; Communication skills; Deep Learning; Diagnosis; Eye; Eye contact; Female; Hand; Humanities and Social Sciences; Humans; Infant; Machine Learning; Male; Models, Theoretical; multidisciplinary; Neural networks; Neural Networks, Computer; Recall; Science; Science (multidisciplinary); Social behavior |
Title | Detection of eye contact with deep neural networks is as accurate as human experts |
URI | https://link.springer.com/article/10.1038/s41467-020-19712-x https://www.ncbi.nlm.nih.gov/pubmed/33318484 https://www.proquest.com/docview/2473191642 https://www.proquest.com/docview/2470280539 https://pubmed.ncbi.nlm.nih.gov/PMC7736573 https://doaj.org/article/f237553c732548f58bf57fdc2d279307 |
Volume | 11 |