Deep Neural Networks for Chronological Age Estimation From OPG Images
Chronological age estimation is crucial labour in many clinical procedures, where the teeth have proven to be one of the best estimators. Although some methods to estimate the age from tooth measurements in orthopantomogram (OPG) images have been developed, they rely on time-consuming manual process...
Published in | IEEE transactions on medical imaging Vol. 39; no. 7; pp. 2374 - 2384 |
Main Authors | Vila-Blanco, Nicolas; Carreira, Maria J.; Varas-Quintana, Paulina; Balsa-Castro, Carlos; Tomas, Inmaculada |
Format | Journal Article |
Language | English |
Published | United States: IEEE, 01.07.2020; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Subjects | |
Abstract | Chronological age estimation is crucial labour in many clinical procedures, where the teeth have proven to be one of the best estimators. Although some methods to estimate the age from tooth measurements in orthopantomogram (OPG) images have been developed, they rely on time-consuming manual processes whose results are affected by the observer subjectivity. Furthermore, all those approaches have been tested only on OPG image sets of good radiological quality without any conditioning dental characteristic. In this work, two fully automatic methods to estimate the chronological age of a subject from the OPG image are proposed. The first (DANet) consists of a sequential Convolutional Neural Network (CNN) path to predict the age, while the second (DASNet) adds a second CNN path to predict the sex and uses sex-specific features with the aim of improving the age prediction performance. Both methods were tested on a set of 2289 OPG images of subjects from 4.5 to 89.2 years old, where both bad radiological quality images and images showing conditioning dental characteristics were not discarded. The results showed that the DASNet outperforms the DANet in every aspect, reducing the median Error (E) and the median Absolute Error (AE) by about 4 months in the entire database. When evaluating the DASNet in the reduced datasets, the AE values decrease as the real age of the subjects decreases, until reaching a median of about 8 months in the subjects younger than 15. The DASNet method was also compared to the state-of-the-art manual age estimation methods, showing significantly less over- or under-estimation problems. Consequently, we conclude that the DASNet can be used to automatically predict the chronological age of a subject accurately, especially in young subjects with developing dentitions. |
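The abstract describes a dual-path design: DANet is a single sequential CNN path that regresses age, while DASNet adds a second CNN path that predicts sex and feeds sex-specific features back into the age prediction. The sketch below is a minimal, hypothetical reconstruction of that idea in Keras (the record's references cite Keras, batch normalization, ReLU activations, Glorot initialization, and ADADELTA); the input resolution, layer counts, filter sizes, and exact fusion point are illustrative assumptions, not the published DASNet configuration.

```python
# Minimal, hypothetical sketch of a DASNet-style dual-path CNN in Keras.
# All sizes and the fusion point are assumptions for illustration only.
from tensorflow import keras
from tensorflow.keras import layers


def conv_block(x, filters):
    """Conv -> BatchNorm -> ReLU -> MaxPool; Conv2D defaults to Glorot init."""
    x = layers.Conv2D(filters, 3, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)
    return layers.MaxPooling2D(2)(x)


def build_dasnet_sketch(input_shape=(256, 512, 1)):
    inputs = keras.Input(shape=input_shape)        # grayscale OPG image (assumed size)

    # Sex-prediction path
    s = inputs
    for f in (16, 32, 64):
        s = conv_block(s, f)
    s = layers.GlobalAveragePooling2D()(s)
    sex_features = layers.Dense(32, activation="relu")(s)
    sex_out = layers.Dense(1, activation="sigmoid", name="sex")(sex_features)

    # Age-prediction path, with the sex-specific features concatenated in
    a = inputs
    for f in (16, 32, 64):
        a = conv_block(a, f)
    a = layers.GlobalAveragePooling2D()(a)
    a = layers.Concatenate()([a, sex_features])    # fuse sex features into the age path
    age_out = layers.Dense(1, name="age")(a)       # chronological age in years

    model = keras.Model(inputs, [age_out, sex_out])
    model.compile(
        optimizer="adadelta",                      # ADADELTA appears in the cited references
        loss={"age": "mae", "sex": "binary_crossentropy"},
    )
    return model
```

Training both outputs jointly lets the sex-classification loss shape the features that are concatenated into the age head, which is the mechanism the abstract credits for the roughly four-month reduction in median error of DASNet over the single-path DANet.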
Author | Tomas, Inmaculada Balsa-Castro, Carlos Carreira, Maria J. Vila-Blanco, Nicolas Varas-Quintana, Paulina |
Author_xml | – sequence: 1 givenname: Nicolas orcidid: 0000-0001-5865-9973 surname: Vila-Blanco fullname: Vila-Blanco, Nicolas email: nicolas.vila@usc.es organization: Centro Singular de Investigación en Tecnoloxías Intelixentes, Universidade de Santiago de Compostela, Santiago de Compostela, Spain – sequence: 2 givenname: Maria J. orcidid: 0000-0003-0532-2351 surname: Carreira fullname: Carreira, Maria J. email: mariajose.carreira@usc.es organization: Centro Singular de Investigación en Tecnoloxías Intelixentes, Universidade de Santiago de Compostela, Santiago de Compostela, Spain – sequence: 3 givenname: Paulina orcidid: 0000-0003-1333-0810 surname: Varas-Quintana fullname: Varas-Quintana, Paulina email: paulina.varas@rai.usc.es organization: Health Research Institute Foundation of Santiago (FIDIS), Santiago de Compostela, Spain – sequence: 4 givenname: Carlos orcidid: 0000-0001-7937-6673 surname: Balsa-Castro fullname: Balsa-Castro, Carlos email: cbalsa@coitt.es organization: Health Research Institute Foundation of Santiago (FIDIS), Santiago de Compostela, Spain – sequence: 5 givenname: Inmaculada surname: Tomas fullname: Tomas, Inmaculada email: inmaculada.tomas@usc.es organization: Health Research Institute Foundation of Santiago (FIDIS), Santiago de Compostela, Spain |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/32012002$$D View this record in MEDLINE/PubMed |
CODEN | ITMID4 |
CitedBy_id | crossref_primary_10_1007_s00194_025_00751_x crossref_primary_10_1007_s00330_025_11373_y crossref_primary_10_1186_s12903_023_02817_2 crossref_primary_10_1007_s00414_022_02796_z crossref_primary_10_1016_j_legalmed_2020_101826 crossref_primary_10_1007_s10006_025_01334_6 crossref_primary_10_1109_ACCESS_2024_3466953 crossref_primary_10_1109_ACCESS_2022_3187959 crossref_primary_10_1007_s11042_023_17776_7 crossref_primary_10_3390_computation11020018 crossref_primary_10_1109_JBHI_2023_3297610 crossref_primary_10_35414_akufemubid_1451334 crossref_primary_10_1111_1556_4029_15473 crossref_primary_10_1080_03772063_2023_2165177 crossref_primary_10_1007_s00414_024_03162_x crossref_primary_10_1007_s00784_024_05598_2 crossref_primary_10_7759_cureus_73028 crossref_primary_10_1016_j_eswa_2022_116968 crossref_primary_10_3390_jimaging10100239 crossref_primary_10_1016_j_compbiomed_2022_106072 crossref_primary_10_3390_oral5010003 crossref_primary_10_1016_j_forsciint_2023_111704 crossref_primary_10_1007_s10278_023_00956_0 crossref_primary_10_3390_electronics12143066 crossref_primary_10_1007_s00414_024_03167_6 crossref_primary_10_1186_s41935_022_00314_1 crossref_primary_10_1038_s41598_023_48960_2 crossref_primary_10_3390_diagnostics15030314 crossref_primary_10_1038_s41598_023_27950_4 crossref_primary_10_3390_healthcare11141979 crossref_primary_10_1007_s00414_024_03204_4 crossref_primary_10_1007_s00414_025_03452_y crossref_primary_10_3390_healthcare11081068 crossref_primary_10_1016_j_arr_2023_102144 crossref_primary_10_1111_1556_4029_15629 crossref_primary_10_3390_jcm12030937 crossref_primary_10_1186_s12859_022_04935_0 crossref_primary_10_1186_s12903_022_02652_x crossref_primary_10_1016_j_jflm_2024_102679 crossref_primary_10_32604_cmc_2023_029914 crossref_primary_10_4103_sjhs_sjhs_124_23 crossref_primary_10_17694_bajece_1351546 crossref_primary_10_1007_s00521_022_07981_0 crossref_primary_10_1007_s00414_025_03432_2 crossref_primary_10_1186_s12903_024_03928_0 crossref_primary_10_1186_s12903_021_01996_0 crossref_primary_10_3390_app14167014 crossref_primary_10_1016_j_compbiomed_2024_108927 crossref_primary_10_3390_healthcare12131311 crossref_primary_10_1259_dmfr_20220363 crossref_primary_10_1007_s00414_020_02489_5 crossref_primary_10_1016_j_media_2022_102470 crossref_primary_10_1038_s41746_022_00681_y crossref_primary_10_18231_j_idjsr_2023_012 crossref_primary_10_1007_s11042_023_17048_4 crossref_primary_10_1007_s00414_023_03140_9 crossref_primary_10_18231_j_ijfmts_2024_014 crossref_primary_10_1109_JSEN_2023_3334555 crossref_primary_10_3390_app13063860 crossref_primary_10_3389_fpubh_2022_1068253 crossref_primary_10_5937_bjdm2303181A crossref_primary_10_1080_08839514_2022_2073724 crossref_primary_10_1007_s00414_022_02928_5 crossref_primary_10_1016_j_eswa_2021_116038 crossref_primary_10_3390_diagnostics14080806 crossref_primary_10_1007_s11548_021_02474_2 crossref_primary_10_4274_mirt_galenos_2022_63644 crossref_primary_10_1007_s00414_023_02960_z crossref_primary_10_1016_j_neucom_2022_09_080 crossref_primary_10_3390_biology12111403 crossref_primary_10_1007_s10489_023_05096_0 crossref_primary_10_1111_coin_12660 crossref_primary_10_1016_j_compmedimag_2024_102329 crossref_primary_10_1038_s41598_024_54877_1 crossref_primary_10_1007_s42452_023_05503_8 crossref_primary_10_18231_j_ijfmts_2024_026 crossref_primary_10_3390_ijerph20054620 crossref_primary_10_1038_s41598_024_70621_1 |
Cites_doi | 10.1109/EMBC.2018.8512755 10.1016/j.forsciint.2017.08.032 10.1142/9789812772558_0013 10.1111/j.1600-0722.1991.tb01029.x 10.4258/hir.2018.24.3.236 10.1002/ajpa.1330130206 10.1259/dmfr.20170362 10.1177/00220345630420062701 10.1007/s00414-004-0489-5 10.1016/j.forsciint.2007.03.009 10.3109/00016358608997720 10.1177/00220345590380010601 10.1001/jama.2013.281053 10.1016/j.forsciint.2016.11.009 10.1016/j.media.2017.07.005 10.1186/1472-6831-14-160 10.1016/S0379-0738(00)00154-7 10.1016/j.future.2019.01.057 10.1016/j.jflm.2015.09.001 10.1109/5.726791 10.1080/00085030.2017.1281632 10.1109/EMBC.2018.8512732 10.1016/j.jflm.2015.11.011 10.1007/s00414-010-0515-8 10.1007/s00414-008-0279-6 10.1080/00016359850142862 10.1007/s11749-018-0611-5 10.1016/j.forsciint.2006.02.019 10.1080/03014467600001671 10.1017/CBO9780511542565.004 10.1016/j.jflm.2017.03.005 10.1016/j.forsciint.2007.09.001 10.1016/j.eswa.2018.04.001 10.1007/s00414-005-0047-9 10.1016/S1353-1131(99)90170-0 10.1166/jmihi.2017.2257 10.1016/S0379-0738(98)00034-6 10.1007/s00414-007-0179-1 10.1016/j.forsciint.2016.10.001 10.1093/ejo/cjn081 10.1093/ejo/7.1.25 10.1111/j.1834-7819.2002.tb00333.x 10.1109/ICCV.2017.74 10.1007/s00414-009-0380-5 10.1111/j.1556-4029.2008.00778.x 10.1259/dmfr.20180051 10.1016/j.jflm.2018.07.014 10.1007/s00414-005-0530-3 10.1016/j.forsciint.2008.03.014 10.1177/00220345730520030701 10.1109/SIBGRAPI.2018.00058 10.1016/j.media.2016.10.010 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020 |
Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020 |
DBID | 97E ESBDL RIA RIE AAYXX CITATION NPM 7QF 7QO 7QQ 7SC 7SE 7SP 7SR 7TA 7TB 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D NAPCQ P64 7X8 |
DOI | 10.1109/TMI.2020.2968765 |
DatabaseName | Accès UT - IEEE All-Society Periodicals Package (ASPP) 2005–Present IEEE Xplore Open Access Journals IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE/IET Electronic Library CrossRef PubMed Aluminium Industry Abstracts Biotechnology Research Abstracts Ceramic Abstracts Computer and Information Systems Abstracts Corrosion Abstracts Electronics & Communications Abstracts Engineered Materials Abstracts Materials Business File Mechanical & Transportation Engineering Abstracts Solid State and Superconductivity Abstracts METADEX Technology Research Database ANTE: Abstracts in New Technology & Engineering Engineering Research Database Aerospace Database Materials Research Database ProQuest Computer Science Collection Civil Engineering Abstracts Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional Nursing & Allied Health Premium Biotechnology and BioEngineering Abstracts MEDLINE - Academic |
DatabaseTitle | CrossRef PubMed Materials Research Database Civil Engineering Abstracts Aluminium Industry Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Mechanical & Transportation Engineering Abstracts Electronics & Communications Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Ceramic Abstracts Materials Business File METADEX Biotechnology and BioEngineering Abstracts Computer and Information Systems Abstracts Professional Aerospace Database Nursing & Allied Health Premium Engineered Materials Abstracts Biotechnology Research Abstracts Solid State and Superconductivity Abstracts Engineering Research Database Corrosion Abstracts Advanced Technologies Database with Aerospace ANTE: Abstracts in New Technology & Engineering MEDLINE - Academic |
DatabaseTitleList | MEDLINE - Academic Materials Research Database PubMed |
Database_xml | – sequence: 1 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 2 dbid: RIE name: IEEE/IET Electronic Library url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/ sourceTypes: Publisher |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Medicine Engineering Dentistry |
EISSN | 1558-254X |
EndPage | 2384 |
ExternalDocumentID | 32012002 10_1109_TMI_2020_2968765 8977504 |
Genre | orig-research Research Support, Non-U.S. Gov't Journal Article |
GrantInformation_xml | – fundername: Potential grantid: ED431B 2017/029 – fundername: European Regional Development Fund (ERDF) funderid: 10.13039/501100008530 – fundername: Consellería de Cultura, Educación e Ordenación Universitaria grantid: ED431G/08 funderid: 10.13039/501100008425 – fundername: N Vila-Blanco grantid: ED481A-2017 – fundername: Competitive Reference grantid: ED431C 2017/69 |
GroupedDBID | --- -DZ -~X .GJ 0R~ 29I 4.4 53G 5GY 5RE 5VS 6IK 97E AAJGR AARMG AASAJ AAWTH ABAZT ABQJQ ABVLG ACGFO ACGFS ACIWK ACNCT ACPRK AENEX AETIX AFRAH AGQYO AGSQL AHBIQ AI. AIBXA AKJIK AKQYR ALLEH ALMA_UNASSIGNED_HOLDINGS ASUFR ATWAV BEFXN BFFAM BGNUA BKEBE BPEOZ CS3 DU5 EBS EJD ESBDL F5P HZ~ H~9 IBMZZ ICLAB IFIPE IFJZH IPLJI JAVBF LAI M43 MS~ O9- OCL P2P PQQKQ RIA RIE RNS RXW TAE TN5 VH1 AAYOK AAYXX CITATION RIG NPM 7QF 7QO 7QQ 7SC 7SE 7SP 7SR 7TA 7TB 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D NAPCQ P64 7X8 |
IEDL.DBID | RIE |
ISSN | 0278-0062 1558-254X |
IngestDate | Fri Jul 11 04:27:18 EDT 2025 Mon Jun 30 06:32:36 EDT 2025 Thu Apr 03 07:00:38 EDT 2025 Tue Jul 01 03:16:03 EDT 2025 Thu Apr 24 22:54:18 EDT 2025 Wed Aug 27 02:37:54 EDT 2025 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | false |
IsScholarly | true |
Issue | 7 |
Language | English |
License | https://creativecommons.org/licenses/by/4.0/legalcode |
LinkModel | DirectLink |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
ORCID | 0000-0001-7937-6673 0000-0001-5865-9973 0000-0003-0532-2351 0000-0003-1333-0810 |
OpenAccessLink | https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/document/8977504 |
PMID | 32012002 |
PQID | 2419494670 |
PQPubID | 85460 |
PageCount | 11 |
ParticipantIDs | crossref_primary_10_1109_TMI_2020_2968765 crossref_citationtrail_10_1109_TMI_2020_2968765 proquest_miscellaneous_2350899432 pubmed_primary_32012002 proquest_journals_2419494670 ieee_primary_8977504 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 2020-07-01 |
PublicationDateYYYYMMDD | 2020-07-01 |
PublicationDate_xml | – month: 07 year: 2020 text: 2020-07-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationPlace_xml | – name: United States – name: New York |
PublicationTitle | IEEE transactions on medical imaging |
PublicationTitleAbbrev | TMI |
PublicationTitleAlternate | IEEE Trans Med Imaging |
PublicationYear | 2020 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
References | ref57 ref56 ref59 ref58 ref14 ref11 ref17 ref16 ref19 ref18 tanner (ref2) 1975; 16 ref50 roche (ref3) 1988 ref46 ref45 ref48 ref42 ref44 ref43 ref49 association (ref47) 2013; 310 ref8 ref7 ref4 ref5 ref40 ref35 ref34 ref37 gg (ref15) 1985; 7 ref36 ref31 greulich (ref1) 1959; 2 ref30 ref33 ref32 haavikko (ref10) 1970; 66 ref39 ref38 ioffe (ref53) 2015 nolla (ref6) 1960; 27 glorot (ref54) 2010 zeiler (ref55) 2012 oktay (ref41) 2017 ref24 ref23 ref26 ref25 nair (ref52) 2010 ref64 ref20 demirjian (ref13) 1973; 45 ref63 ref66 ref22 ref65 ref21 lewis (ref12) 1960; 30 ref28 chollet (ref51) 2015 ref27 ref29 capitaneanu (ref67) 2017; 35 ref60 ref62 ref61 haavikko (ref9) 1974; 70 |
References_xml | – ident: ref45 doi: 10.1109/EMBC.2018.8512755 – ident: ref35 doi: 10.1016/j.forsciint.2017.08.032 – year: 2015 ident: ref53 article-title: Batch normalization: Accelerating deep network training by reducing internal covariate shift publication-title: arXiv 1502 03167 – ident: ref56 doi: 10.1142/9789812772558_0013 – ident: ref17 doi: 10.1111/j.1600-0722.1991.tb01029.x – ident: ref46 doi: 10.4258/hir.2018.24.3.236 – ident: ref11 doi: 10.1002/ajpa.1330130206 – ident: ref34 doi: 10.1259/dmfr.20170362 – ident: ref7 doi: 10.1177/00220345630420062701 – ident: ref23 doi: 10.1007/s00414-004-0489-5 – start-page: 807 year: 2010 ident: ref52 article-title: Rectified linear units improve restricted Boltzmann machines publication-title: Proc 27th Int Conf Mach Learn (ICML) – ident: ref25 doi: 10.1016/j.forsciint.2007.03.009 – ident: ref16 doi: 10.3109/00016358608997720 – ident: ref31 doi: 10.1177/00220345590380010601 – volume: 310 start-page: 2191 year: 2013 ident: ref47 article-title: World medical association declaration of Helsinki: Ethical principles for medical research involving human subjects publication-title: JAMA doi: 10.1001/jama.2013.281053 – ident: ref27 doi: 10.1016/j.forsciint.2016.11.009 – ident: ref49 doi: 10.1016/j.media.2017.07.005 – ident: ref48 doi: 10.1186/1472-6831-14-160 – ident: ref21 doi: 10.1016/S0379-0738(00)00154-7 – ident: ref5 doi: 10.1016/j.future.2019.01.057 – ident: ref66 doi: 10.1016/j.jflm.2015.09.001 – year: 1988 ident: ref3 publication-title: Assessing the Skeletal Maturity of the Hand-Wrist FELS Method – volume: 30 start-page: 70 year: 1960 ident: ref12 article-title: The relationship between tooth formation and other maturational factors publication-title: Angle Orthodontist – ident: ref50 doi: 10.1109/5.726791 – ident: ref30 doi: 10.1080/00085030.2017.1281632 – ident: ref44 doi: 10.1109/EMBC.2018.8512732 – ident: ref28 doi: 10.1016/j.jflm.2015.11.011 – volume: 70 start-page: 9 year: 1974 ident: ref9 article-title: Tooth formation age estimated on a few selected teeth. A simple method for clinical use publication-title: Proc Finn Dent Soc – ident: ref26 doi: 10.1007/s00414-010-0515-8 – ident: ref33 doi: 10.1007/s00414-008-0279-6 – ident: ref19 doi: 10.1080/00016359850142862 – ident: ref57 doi: 10.1007/s11749-018-0611-5 – ident: ref24 doi: 10.1016/j.forsciint.2006.02.019 – ident: ref14 doi: 10.1080/03014467600001671 – ident: ref32 doi: 10.1017/CBO9780511542565.004 – start-page: 249 year: 2010 ident: ref54 article-title: Understanding the difficulty of training deep feedforward neural networks publication-title: Proc 13th Int Conf Artif Intell Statist – ident: ref29 doi: 10.1016/j.jflm.2017.03.005 – ident: ref59 doi: 10.1016/j.forsciint.2007.09.001 – volume: 2 year: 1959 ident: ref1 publication-title: Radiographic Atlas of Skeletal Development of the Hand and Wrist – ident: ref39 doi: 10.1016/j.eswa.2018.04.001 – ident: ref8 doi: 10.1007/s00414-005-0047-9 – volume: 66 start-page: 103 year: 1970 ident: ref10 article-title: The formation and the alveolar and clinical eruption of the permanent teeth. 
An orthopantomographic study publication-title: Suom Hammaslaak Toim – ident: ref20 doi: 10.1016/S1353-1131(99)90170-0 – start-page: 1 year: 2017 ident: ref41 article-title: Tooth detection with convolutional neural networks publication-title: Proc Med Technol Nat Congr (TIPTEKNO) – ident: ref43 doi: 10.1166/jmihi.2017.2257 – ident: ref18 doi: 10.1016/S0379-0738(98)00034-6 – ident: ref38 doi: 10.1007/s00414-007-0179-1 – ident: ref37 doi: 10.1016/j.forsciint.2016.10.001 – year: 2012 ident: ref55 article-title: ADADELTA: An adaptive learning rate method publication-title: arXiv 1212 5701 – volume: 45 start-page: 211 year: 1973 ident: ref13 article-title: A new system of dental age assessment publication-title: Hum Biol – volume: 27 start-page: 66 year: 1960 ident: ref6 article-title: The development of the human dentition publication-title: J Dent Child – volume: 16 year: 1975 ident: ref2 publication-title: Assessment of Skeletal Maturity and Prediction of Adult Height (TW2 Method) – year: 2015 ident: ref51 publication-title: Keras – ident: ref61 doi: 10.1093/ejo/cjn081 – volume: 7 start-page: 25 year: 1985 ident: ref15 article-title: Dental maturity as an indicator of chronological age: The accuracy and precision of three methods publication-title: Eur J Orthodontics doi: 10.1093/ejo/7.1.25 – ident: ref22 doi: 10.1111/j.1834-7819.2002.tb00333.x – ident: ref58 doi: 10.1109/ICCV.2017.74 – ident: ref62 doi: 10.1007/s00414-009-0380-5 – ident: ref63 doi: 10.1111/j.1556-4029.2008.00778.x – ident: ref42 doi: 10.1259/dmfr.20180051 – volume: 35 start-page: 1 year: 2017 ident: ref67 article-title: A systematic review of odontological sex estimation methods publication-title: J Forensic Odonto- Stomat – ident: ref36 doi: 10.1016/j.jflm.2018.07.014 – ident: ref60 doi: 10.1007/s00414-005-0530-3 – ident: ref64 doi: 10.1016/j.forsciint.2008.03.014 – ident: ref65 doi: 10.1177/00220345730520030701 – ident: ref40 doi: 10.1109/SIBGRAPI.2018.00058 – ident: ref4 doi: 10.1016/j.media.2016.10.010 |
SSID | ssj0014509 |
Snippet | Chronological age estimation is crucial labour in many clinical procedures, where the teeth have proven to be one of the best estimators. Although some methods... |
SourceID | proquest pubmed crossref ieee |
SourceType | Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 2374 |
SubjectTerms | Age Age determination Artificial neural networks Biomedical imaging chronological age Chronology Conditioning Convolution Deep learning dental age Dentistry Estimation forensic age Forensic odontology Image quality Manuals Neural networks panoramic images Sex Task analysis Teeth |
Title | Deep Neural Networks for Chronological Age Estimation From OPG Images |
URI | https://ieeexplore.ieee.org/document/8977504 https://www.ncbi.nlm.nih.gov/pubmed/32012002 https://www.proquest.com/docview/2419494670 https://www.proquest.com/docview/2350899432 |
Volume | 39 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
linkProvider | IEEE |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Deep+Neural+Networks+for+Chronological+Age+Estimation+From+OPG+Images&rft.jtitle=IEEE+transactions+on+medical+imaging&rft.au=Vila-Blanco%2C+Nicolas&rft.au=Carreira%2C+Maria+J&rft.au=Varas-Quintana%2C+Paulina&rft.au=Balsa-Castro%2C+Carlos&rft.date=2020-07-01&rft.pub=The+Institute+of+Electrical+and+Electronics+Engineers%2C+Inc.+%28IEEE%29&rft.issn=0278-0062&rft.eissn=1558-254X&rft.volume=39&rft.issue=7&rft.spage=2374&rft_id=info:doi/10.1109%2FTMI.2020.2968765&rft.externalDBID=NO_FULL_TEXT |