Exclusive Feature Constrained Class Activation Mapping for Better Visual Explanation
Published in | IEEE Access, Vol. 9, pp. 61417-61428 |
Main Authors | Wang, Pengda; Kong, Xiangwei; Guo, Weikuo; Zhang, Xunpeng |
Format | Journal Article |
Language | English |
Published | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2021 |
Subjects | Artificial neural networks; class activation mapping; Constraints; Inspection; Interpretability; interpretability evaluation; Mapping; Noise measurement; Optimization; Pipelines; Predictive models; Semantics; Task analysis; Visual discrimination; visual explanation; Visualization |
Online Access | https://doaj.org/article/b08ab445c48945268f9d802021e7c11d |
Abstract | While Deep Neural Networks (DNNs) show impressive performance on large-scale data, their lack of interpretability limits their use in security-relevant scenarios. To make visual explanations less noisy and more class-discriminative, in this work we propose a visual explanation method for DNNs named Exclusive Feature Constrained Class Activation Mapping (EFC-CAM). A new exclusive feature constraint is introduced to optimize the weights calculated by Grad-CAM or initialized from a constant vector. To better measure visual explanation methods, we also design an effective evaluation metric that does not require bounding boxes as auxiliary information. Extensive quantitative experiments and visual inspection on the ImageNet and Fashion validation sets show the effectiveness of the proposed method. |
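The abstract states that EFC-CAM optimizes class-activation weights (initialized from Grad-CAM or from a constant vector) under an exclusive feature constraint, but this record does not spell out the constraint or the optimization procedure. The sketch below is only a hedged illustration of the surrounding pipeline: Grad-CAM channel weights are computed from gradients of the target class score and then refined by gradient descent, with `placeholder_penalty` standing in for the unspecified constraint. The VGG-16 backbone, target layer, learning rate, and penalty form are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch (not the paper's code): a generic Grad-CAM pipeline followed by a
# gradient-based refinement of the channel weights. EFC-CAM's actual exclusive
# feature constraint is not given in this record; placeholder_penalty only marks
# where such a term would enter the objective.
import torch
import torch.nn.functional as F
from torchvision import models


def compute_cam(weights, activations, size):
    """Weighted sum of feature maps -> ReLU -> upsample -> normalize to [0, 1]."""
    cam = F.relu((weights[:, :, None, None] * activations).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=size, mode="bilinear", align_corners=False)
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)


def placeholder_penalty(cam):
    # Stand-in for the (unspecified) exclusive feature constraint: an L1 term that
    # discourages large, diffuse maps so the explanation concentrates on fewer pixels.
    return cam.mean()


def constrained_cam_sketch(model, target_layer, x, target_class,
                           steps=30, lr=0.05, lam=1.0):
    model.eval()
    store = {}
    handle = target_layer.register_forward_hook(lambda m, i, o: store.update(a=o))

    # 1) Grad-CAM initialization: global-average-pooled gradients of the class score
    #    with respect to the feature maps give one weight per channel.
    score = model(x)[0, target_class]
    grads = torch.autograd.grad(score, store["a"])[0]
    w = grads.mean(dim=(2, 3)).detach().clone().requires_grad_(True)   # shape (1, C)
    acts = store["a"].detach()
    handle.remove()

    # 2) Refine the weights: keep the class score of the masked input high while
    #    penalizing the map, so regions irrelevant to the target class are suppressed.
    opt = torch.optim.Adam([w], lr=lr)
    for _ in range(steps):
        cam = compute_cam(w, acts, x.shape[-2:])
        masked_score = model(x * cam)[0, target_class]
        loss = -masked_score + lam * placeholder_penalty(cam)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return compute_cam(w, acts, x.shape[-2:]).detach()


if __name__ == "__main__":
    model = models.vgg16(weights=None)   # random weights keep the demo self-contained
    layer = model.features[28]           # last conv layer of the VGG-16 feature extractor
    x = torch.randn(1, 3, 224, 224)      # stand-in for a preprocessed image
    cam = constrained_cam_sketch(model, layer, x, target_class=243)
    print(cam.shape)                     # torch.Size([1, 1, 224, 224])
```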
Author | Kong, Xiangwei; Guo, Weikuo; Zhang, Xunpeng; Wang, Pengda |
Author_xml | 1. Wang, Pengda (ORCID: 0000-0001-6779-574X), School of Information and Communication Engineering, Dalian University of Technology, Dalian, China; 2. Kong, Xiangwei (ORCID: 0000-0002-0851-6752; kongxiangwei@zju.edu.cn), School of Management, Zhejiang University, Hangzhou, China; 3. Guo, Weikuo, School of Information and Communication Engineering, Dalian University of Technology, Dalian, China; 4. Zhang, Xunpeng, School of Information and Communication Engineering, Dalian University of Technology, Dalian, China |
CODEN | IAECCG |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
DOI | 10.1109/ACCESS.2021.3073465 |
Discipline | Engineering |
EISSN | 2169-3536 |
EndPage | 61428 |
Genre | orig-research |
GrantInformation_xml | National Natural Science Foundation of China, Grant 61772111 (funder ID: 10.13039/501100001809) |
ISSN | 2169-3536 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://creativecommons.org/licenses/by/4.0/legalcode |
ORCID | 0000-0001-6779-574X 0000-0002-0851-6752 |
OpenAccessLink | https://doaj.org/article/b08ab445c48945268f9d802021e7c11d |
PageCount | 12 |
PublicationDate | 2021-01-01 |
PublicationPlace | Piscataway |
PublicationTitle | IEEE access |
PublicationTitleAbbrev | Access |
PublicationYear | 2021 |
Publisher | IEEE (The Institute of Electrical and Electronics Engineers, Inc.) |
StartPage | 61417 |
SubjectTerms | Artificial neural networks; class activation mapping; Constraints; Inspection; Interpretability; interpretability evaluation; Mapping; Noise measurement; Optimization; Pipelines; Predictive models; Semantics; Task analysis; Visual discrimination; visual explanation; Visualization |
Title | Exclusive Feature Constrained Class Activation Mapping for Better Visual Explanation |
URI | https://ieeexplore.ieee.org/document/9405672 https://www.proquest.com/docview/2519083105 https://doaj.org/article/b08ab445c48945268f9d802021e7c11d |
Volume | 9 |