Adaptively Weighted k-Tuple Metric Network for Kinship Verification
Facial image-based kinship verification is a rapidly growing field in computer vision and biometrics. The key to determining whether a pair of facial images has a kin relation is to train a model that can enlarge the margin between the faces that have no kin relation while reducing the distance between faces that have a kin relation.
Published in | IEEE Transactions on Cybernetics, Vol. 53, no. 10, pp. 6173-6186 |
Main Authors | Huang, Sheng; Lin, Jingkai; Huangfu, Luwen; Xing, Yun; Hu, Junlin; Zeng, Daniel Dajun |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.10.2023 |
Subjects | Computer vision; Convolutional neural networks; Deep learning; kinship verification; metric learning; triplet loss |
Abstract | Facial image-based kinship verification is a rapidly growing field in computer vision and biometrics. The key to determining whether a pair of facial images has a kin relation is to train a model that can enlarge the margin between the faces that have no kin relation while reducing the distance between faces that have a kin relation. Most existing approaches primarily exploit duplet (i.e., two input samples without cross pair) or triplet (i.e., single negative pair for each positive pair with low-order cross pair) information, omitting discriminative features from multiple negative pairs. These approaches suffer from weak generalizability, resulting in unsatisfactory performance. Inspired by human visual systems that incorporate both low-order and high-order cross-pair information from local and global perspectives, we propose to leverage high-order cross-pair features and develop a novel end-to-end deep learning model called the adaptively weighted k-tuple metric network (AWk-TMN). Our main contributions are three-fold. First, a novel cross-pair metric learning loss based on k-tuplet loss is introduced. It naturally captures both the low-order and high-order discriminative features from multiple negative pairs. Second, an adaptively weighted scheme is formulated to better highlight hard negative examples among multiple negative pairs, leading to enhanced performance. Third, the model utilizes multiple levels of convolutional features and jointly optimizes feature and metric learning to further exploit the low-order and high-order representational power. Extensive experimental results on three popular kinship verification datasets demonstrate the effectiveness of our proposed AWk-TMN approach compared with several state-of-the-art approaches. The source codes and models are released. |
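The adaptive weighting of hard negatives described in the abstract can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch rendering of a k-tuplet loss with softmax-based hard-negative weighting; the function name `awk_tuplet_loss`, the softmax weighting with temperature `tau`, and the margin hinge are illustrative assumptions, not the authors' exact AWk-TMN formulation.

```python
# Hypothetical sketch, not the paper's released implementation.
import torch
import torch.nn.functional as F


def awk_tuplet_loss(anchor, positive, negatives, margin=1.0, tau=1.0):
    """anchor, positive: (B, D) embeddings of a kin pair.
    negatives: (B, K, D) embeddings of K non-kin samples per anchor.
    Pulls each positive pair together and pushes all K negative
    pairs apart, giving harder negatives larger weight.
    """
    d_pos = (anchor - positive).pow(2).sum(dim=1)                 # (B,)
    d_neg = (anchor.unsqueeze(1) - negatives).pow(2).sum(dim=2)   # (B, K)

    # Adaptive weights: negatives closer to the anchor (harder)
    # receive larger weight via a softmax over negated distances.
    w = F.softmax(-d_neg / tau, dim=1)                            # (B, K)

    # Margin hinge applied to every (positive, negative) pair.
    hinge = F.relu(d_pos.unsqueeze(1) - d_neg + margin)           # (B, K)
    return (w * hinge).sum(dim=1).mean()
```

With K = 1 and uniform weights this reduces to the standard triplet loss, which is one way to read the abstract's claim that the k-tuplet formulation generalizes duplet and triplet supervision to multiple negative pairs.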
Author | Huang, Sheng; Xing, Yun; Huangfu, Luwen; Hu, Junlin; Zeng, Daniel Dajun; Lin, Jingkai |
Author_xml |
– 1. Huang, Sheng (huangsheng@cqu.edu.cn; ORCID 0000-0001-5610-0826): Key Laboratory of Dependable Service Computing in Cyber Physical Society, Ministry of Education, and the School of Big Data and Software Engineering, Chongqing University, Chongqing, China
– 2. Lin, Jingkai (linjingkai@cqu.edu.cn): School of Big Data and Software Engineering, Chongqing University, Chongqing, China
– 3. Huangfu, Luwen (lhuangfu@sdsu.edu; ORCID 0000-0003-3926-7901): Fowler College of Business and the Center for Human Dynamics in the Mobile Age, San Diego State University, San Diego, CA, USA
– 4. Xing, Yun (yxing@cqu.edu.cn): School of Big Data and Software Engineering, Chongqing University, Chongqing, China
– 5. Hu, Junlin (hujunlin@buaa.edu.cn; ORCID 0000-0002-0117-3494): School of Software, Beihang University, Beijing, China
– 6. Zeng, Daniel Dajun (dajun.zeng@ia.ac.cn): Institute of Automation, Chinese Academy of Sciences, Beijing, China |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/35439158 (View this record in MEDLINE/PubMed) |
CODEN | ITCEB8 |
CitedBy_id | 10.1007/s11042-024-18879-5; 10.1007/s00371-024-03318-1; 10.1016/j.eswa.2024.124815; 10.1002/elps.202300169; 10.1007/s00371-024-03493-1 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
DOI | 10.1109/TCYB.2022.3163707 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005–Present; IEEE All-Society Periodicals Package (ASPP) 1998–Present; IEEE Electronic Library (IEL) - NZ; CrossRef; PubMed; Computer and Information Systems Abstracts; Electronics & Communications Abstracts; Mechanical & Transportation Engineering Abstracts; Technology Research Database; ANTE: Abstracts in New Technology & Engineering; Engineering Research Database; Aerospace Database; ProQuest Computer Science Collection; Advanced Technologies Database with Aerospace; Computer and Information Systems Abstracts Academic; Computer and Information Systems Abstracts Professional; MEDLINE - Academic |
Discipline | Sciences (General) |
EISSN | 2168-2275 |
EndPage | 6186 |
ExternalDocumentID | 35439158 10_1109_TCYB_2022_3163707 9760272 |
Genre | orig-research Journal Article |
GrantInformation | National Natural Science Foundation of China, grant 62176030; Natural Science Foundation of Chongqing, grant cstc2021jcyj-msxmX0568 |
ISSN | 2168-2267 2168-2275 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 10 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0002-0117-3494 0000-0003-3926-7901 0000-0001-5610-0826 |
PMID | 35439158 |
PQID | 2865092018 |
PQPubID | 85422 |
PageCount | 14 |
PublicationCentury | 2000 |
PublicationDate | 2023-10-01 |
PublicationDateYYYYMMDD | 2023-10-01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationTitle | IEEE Transactions on Cybernetics |
PublicationTitleAbbrev | TCYB |
PublicationTitleAlternate | IEEE Trans Cybern |
PublicationYear | 2023 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 6173 |
SubjectTerms | Computer vision; Convolutional neural networks; Deep learning; Faces; Feature extraction; Genetics; Human performance; kinship verification; Measurement; metric learning; Performance enhancement; relation network (RN); Task analysis; triplet loss; Verification |
Title | Adaptively Weighted k-Tuple Metric Network for Kinship Verification |
URI | https://ieeexplore.ieee.org/document/9760272 https://www.ncbi.nlm.nih.gov/pubmed/35439158 https://www.proquest.com/docview/2865092018 https://www.proquest.com/docview/2652864987 |
Volume | 53 |