Are Face Detection Models Biased?
Main Authors | Surbhi Mittal, Kartik Thakral, Puspita Majumdar, Mayank Vatsa, Richa Singh |
---|---|
Format | Journal Article |
Language | English |
Published | 07.11.2022 |
Abstract | The presence of bias in deep models leads to unfair outcomes for certain demographic subgroups. Research on bias focuses primarily on facial recognition and attribute prediction, with scarce emphasis on face detection. Existing studies treat face detection as binary classification into 'face' and 'non-face' classes. In this work, we investigate possible bias in the domain of face detection through facial region localization, which is currently unexplored. Since facial region localization is an essential task for all face recognition pipelines, it is imperative to analyze the presence of such bias in popular deep models. Most existing face detection datasets lack suitable annotations for such analysis. Therefore, we web-curate the Fair Face Localization with Attributes (F2LA) dataset and manually annotate more than 10 attributes per face, including facial localization information. Utilizing the extensive annotations from F2LA, an experimental setup is designed to study the performance of four pre-trained face detectors. We observe (i) a high disparity in detection accuracies across gender and skin tone, and (ii) an interplay of confounding factors beyond demography. The F2LA data and associated annotations can be accessed at http://iab-rubric.org/index.php/F2LA. |
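The abstract describes measuring detection accuracy separately per demographic subgroup. A minimal sketch of how such a per-subgroup evaluation could be computed is below; the IoU matching threshold of 0.5, the function names, and the toy data are assumptions for illustration, not the paper's exact protocol.

```python
# Hedged sketch: a detection counts as correct when its predicted box
# overlaps the ground-truth face box with IoU >= 0.5, and accuracy is
# aggregated separately per subgroup (e.g. gender, skin tone).
# All data below is synthetic; F2LA itself must be obtained separately.
from collections import defaultdict

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def subgroup_accuracy(samples, thresh=0.5):
    """samples: list of (subgroup_label, gt_box, predicted_box_or_None)."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, gt, pred in samples:
        totals[group] += 1
        if pred is not None and iou(gt, pred) >= thresh:
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Toy example: each subgroup gets one tight detection and one failure.
samples = [
    ("group_a", (10, 10, 50, 50), (12, 11, 52, 49)),  # tight match -> hit
    ("group_a", (10, 10, 50, 50), (40, 40, 90, 90)),  # poor overlap -> miss
    ("group_b", (0, 0, 40, 40), None),                # no detection -> miss
    ("group_b", (0, 0, 40, 40), (1, 1, 41, 41)),      # tight match -> hit
]
print(subgroup_accuracy(samples))  # -> {'group_a': 0.5, 'group_b': 0.5}
```

A gap between subgroup accuracies under an identical detector and threshold is the kind of disparity the study reports across gender and skin tone.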
Author | Surbhi Mittal, Kartik Thakral, Puspita Majumdar, Mayank Vatsa, Richa Singh |
Copyright | http://creativecommons.org/licenses/by/4.0 |
DOI | 10.48550/arxiv.2211.03588 |
OpenAccessLink | https://arxiv.org/abs/2211.03588 |
SecondaryResourceType | preprint |
SubjectTerms | Computer Science - Computer Vision and Pattern Recognition |