Attention-Guided Sparse Adversarial Attacks with Gradient Dropout(基于梯度丢弃和注意力引导的稀疏对抗攻击)

Bibliographic Details
Published in Journal of Donghua University (English Edition) (东华大学学报(英文版)), Vol. 41; No. 5; pp. 545-556
Main Authors 赵鸿志 (ZHAO Hongzhi), 郝灵广 (HAO Lingguang), 郝矿荣 (HAO Kuangrong), 隗兵 (WEI Bing), 刘肖燕 (LIU Xiaoyan)
Format Journal Article
Language Chinese
Published Engineering Research Center of Digitized Textile and Apparel Technology, Ministry of Education, Donghua University, Shanghai 201620, 2024
College of Information Science and Technology, Donghua University, Shanghai 201620
Abstract TP183; Deep neural networks are highly vulnerable to intentionally generated adversarial examples, which are produced by superimposing tiny noise on clean images. However, most existing transfer-based attack methods add perturbations to every pixel of the original image with equal weight, leaving redundant noise in the adversarial examples and making them easier for detection systems to identify. To address this, this paper introduces a novel attention-guided sparse adversarial attack strategy combined with gradient dropout, which can be integrated with existing gradient-based algorithms to minimize both the intensity and the scale of perturbations while preserving the effectiveness of the adversarial examples. Specifically, in the gradient dropout phase, the strategy randomly discards some relatively unimportant gradient information to limit the intensity of the perturbation; in the attention-guided phase, a soft mask-refined attention mechanism evaluates each pixel's influence on the model output and restricts perturbations on pixels with little influence, thereby controlling the scale of the perturbation. Extensive experiments on the NeurIPS 2017 adversarial dataset and the ILSVRC 2012 validation dataset show that the strategy significantly reduces redundant noise in adversarial examples while preserving attack effectiveness. For example, in attacks on adversarially trained models, after integrating the strategy into an adversarial attack algorithm, the average noise injected into images drops by 8.32%, while the average attack success rate falls by only 0.34%. Moreover, by introducing only a small amount of perturbation, the strategy can markedly improve the attack success rate.
Abstract_FL Deep neural networks are extremely vulnerable to intentionally generated adversarial examples, which are produced by overlaying tiny noise on clean images. However, most existing transfer-based attack methods add perturbations to each pixel of the original image with the same weight, resulting in redundant noise in the adversarial examples, which makes them easier to detect. To address this, a novel attention-guided sparse adversarial attack strategy with gradient dropout, readily incorporated into existing gradient-based methods, is introduced to minimize the intensity and the scale of perturbations while ensuring the effectiveness of the adversarial examples. Specifically, in the gradient dropout phase, some relatively unimportant gradient information is randomly discarded to limit the intensity of the perturbation. In the attention-guided phase, the influence of each pixel on the model output is evaluated by a soft mask-refined attention mechanism, and the perturbation of pixels with smaller influence is limited to restrict the scale of the perturbation. Thorough experiments on the NeurIPS 2017 adversarial dataset and the ILSVRC 2012 validation dataset show that the proposed strategy significantly diminishes the superfluous noise in adversarial examples while keeping their attack efficacy intact. For instance, in attacks on adversarially trained models, upon integration of the strategy the average level of noise injected into images declines by 8.32%, while the average attack success rate decreases by only 0.34%. Furthermore, the strategy can substantially elevate the attack success rate while introducing only a slight degree of perturbation.
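The two phases described in the abstract can be sketched as follows. This is a hypothetical NumPy reconstruction, not the authors' code: the function name `sparse_perturbation`, the below-median criterion for "relatively unimportant" gradients, the `drop_rate` parameter, and the sign-gradient (FGSM-style) update step are all assumptions made for illustration.

```python
import numpy as np

def sparse_perturbation(grad, attn, eps=8 / 255, drop_rate=0.5, seed=0):
    """Illustrative sketch of the two-phase strategy (assumed details).

    grad : loss gradient w.r.t. the input image (any shape)
    attn : soft attention mask in [0, 1], same shape as grad, scoring
           each pixel's influence on the model output
    """
    rng = np.random.default_rng(seed)

    # Phase 1 -- gradient dropout: entries below the median magnitude are
    # treated as "relatively unimportant", and each is randomly dropped
    # with probability drop_rate, limiting the perturbation's intensity.
    mag = np.abs(grad)
    unimportant = mag < np.median(mag)
    dropped = unimportant & (rng.random(grad.shape) < drop_rate)
    keep = ~dropped

    # Phase 2 -- attention guidance: the soft mask down-weights pixels
    # with little influence on the output, limiting the perturbation's
    # scale; the surviving entries get an FGSM-style sign step.
    return eps * np.sign(grad) * keep * attn
```

With `drop_rate=1.0` every below-median gradient entry is zeroed, so the resulting perturbation is sparse; in an iterative attack this delta would be added to the image at each step and clipped back to the ε-ball.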
Author 刘肖燕
郝灵广
赵鸿志
隗兵
郝矿荣
AuthorAffiliation College of Information Science and Technology, Donghua University, Shanghai 201620; Engineering Research Center of Digitized Textile and Apparel Technology, Ministry of Education, Donghua University, Shanghai 201620
AuthorAffiliation_xml – name: College of Information Science and Technology, Donghua University, Shanghai 201620; Engineering Research Center of Digitized Textile and Apparel Technology, Ministry of Education, Donghua University, Shanghai 201620
Author_FL HAO Kuangrong
HAO Lingguang
WEI Bing
ZHAO Hongzhi
LIU Xiaoyan
Author_FL_xml – sequence: 1
  fullname: ZHAO Hongzhi
– sequence: 2
  fullname: HAO Lingguang
– sequence: 3
  fullname: HAO Kuangrong
– sequence: 4
  fullname: WEI Bing
– sequence: 5
  fullname: LIU Xiaoyan
Author_xml – sequence: 1
  fullname: 赵鸿志
– sequence: 2
  fullname: 郝灵广
– sequence: 3
  fullname: 郝矿荣
– sequence: 4
  fullname: 隗兵
– sequence: 5
  fullname: 刘肖燕
BookMark eNrjYmDJy89LZWCQMzTQM7S0sDDRz9IzNDM30jU1MjLQMzIwMjY0MjAwZmHghItyMPAWF2cmGRgYGpoBxUw4GSyfzt_1ZFffs0Xrn-5a9mTHoqd7mp9O6nm2ecWzlv6nXbOf7pn6dP2e57Nanq9oeD6t_-n6nc-6pj-bsvtp-24eBta0xJziVF4ozc2g6eYa4uyhW56Yl5aYlx6flV9alAeUiU_JSKmoSIpPBTrJxMDUwNDAmBS1AEHAVtY
ClassificationCodes TP183
ContentType Journal Article
Copyright Copyright © Wanfang Data Co. Ltd. All Rights Reserved.
Copyright_xml – notice: Copyright © Wanfang Data Co. Ltd. All Rights Reserved.
DBID 2B.
4A8
92I
93N
PSX
TCJ
DOI 10.19884/j.1672-5220.202312003
DatabaseName Wanfang Data Journals - Hong Kong
WANFANG Data Centre
Wanfang Data Journals
万方数据期刊 - 香港版 (Wanfang Data Journals - Hong Kong Edition)
China Online Journals (COJ)
DatabaseTitleList
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
DocumentTitle_FL Attention-Guided Sparse Adversarial Attacks with Gradient Dropout
EndPage 556
ExternalDocumentID dhdxxb_e202405010
GroupedDBID -02
-0B
-SB
-S~
188
2B.
4A8
5VR
5XA
5XC
8RM
92D
92I
92M
93N
9D9
9DB
ABJNI
ACGFS
ADMLS
AFUIB
ALMA_UNASSIGNED_HOLDINGS
CAJEB
CCEZO
CDRFL
CHBEP
CW9
FA0
JUIAU
PSX
Q--
R-B
RT2
S..
T8R
TCJ
TGH
TTC
U1F
U1G
U5B
U5L
UGNYK
UZ2
UZ4
ID FETCH-wanfang_journals_dhdxxb_e2024050103
ISSN 1672-5220
IngestDate Thu May 29 03:59:43 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 5
Keywords adversarial transferability
adversarial attack
sparse adversarial attack
deep neural network
adversarial example
Language Chinese
LinkModel OpenURL
MergedId FETCHMERGED-wanfang_journals_dhdxxb_e2024050103
ParticipantIDs wanfang_journals_dhdxxb_e202405010
PublicationCentury 2000
PublicationDate 2024
PublicationDateYYYYMMDD 2024-01-01
PublicationDate_xml – year: 2024
  text: 2024
PublicationDecade 2020
PublicationTitle 东华大学学报(英文版)
PublicationTitle_FL Journal of Donghua University(English Edition)
PublicationYear 2024
Publisher Engineering Research Center of Digitized Textile and Apparel Technology, Ministry of Education, Donghua University, Shanghai 201620
College of Information Science and Technology, Donghua University, Shanghai 201620
Publisher_xml – name: Engineering Research Center of Digitized Textile and Apparel Technology, Ministry of Education, Donghua University, Shanghai 201620
– name: College of Information Science and Technology, Donghua University, Shanghai 201620
SSID ssib001166724
ssj0000627409
ssib018830140
ssib040214605
ssib022315852
ssib051367670
ssib006703047
Score 4.7217064
Snippet TP183;...
SourceID wanfang
SourceType Aggregation Database
StartPage 545
Title Attention-Guided Sparse Adversarial Attacks with Gradient Dropout (基于梯度丢弃和注意力引导的稀疏对抗攻击)
URI https://d.wanfangdata.com.cn/periodical/dhdxxb-e202405010
Volume 41
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
link http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnR1Na9RAdKjbix7ET_ymiHMQ2ZrPycwxySYUUS9W6G1JNht72oJuofQkWBCKWBA_QA9CD9JbD150uz-n2f0bvvcyTQJdcBWW8PLmzfvMZt7Mzrxl7F6WUCKbtHsZTFGwZDf-SJi2jTzv2Xk_zfop7fJ9KlaeO4_W3LWF1pnGrqXNYbrc2555ruR_ogo4iCuekv2HyFZMAQEwxBeuEGG4zhVjHrlcxTzweeTgVUY8Ety3uB9jE2B8QU0SkYgJubSpl8VliMSBzX2JgISkknpJn6tAEysXAeAGcORx5SMZANBFGoQRuhfSKOID3T0EFMglPtIDoJkEa5VUSK0dUhs4ONz3COiQ2hVAPH3QhNSQoC1ID3hgkhSB_EETqaippKlWN5A2cPETKZQZlI6JufJqEoVOUR3iYhKti8YE8UwSdHhMOoB6dpNElaaDTSixuZ5i1Sup8xkv0PWBoTGaa4jWggYQVww5upWMF8gNWIFOvo3CS5dhhKgJnwZiqBTpbfHA0iZh7EGQR_4t4xoQYGD8EAA_VM9QRzsPOlqhRoIgfIyQyQNI84RlNMY54eEahEbpgbCsQKa_8G5jVHPLip86QXLLSvCnxl4l4RGkwfeE-bKF1QVx-2OdbVR7QLP1bGsr7fYxBoZLxyQXLZjrWS226HeePH5WZ_WmAI71sCFwlKpnwaaUtE5wcg8Jr-mWJQHo3qE_r6-HNbesUmhUS7FYqduhzWCV5rqWAJr0cKZBdMZvkCeDF410dPUCO6_nkUt--VK4yBa21y-xc43qopeZKr6PjkfvJ_uHxejH8a_9Yvym-PBu8vNgsrNX7H4rxp-Kw_H068704PX0815x-Huy-2Xy8ah4e3SF3Y-j1XClrYV39dvoVfeUO-2rrDXYGPSvsSWZJLnpZnZu58Lpp17iZanoealIlMisnnWd3f07vxvzEN1kZxEu1yhvsdbw5Wb_NmTtw_SOjuofvn-ryw
linkProvider EBSCOhost
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=%E5%9F%BA%E4%BA%8E%E6%A2%AF%E5%BA%A6%E4%B8%A2%E5%BC%83%E5%92%8C%E6%B3%A8%E6%84%8F%E5%8A%9B%E5%BC%95%E5%AF%BC%E7%9A%84%E7%A8%80%E7%96%8F%E5%AF%B9%E6%8A%97%E6%94%BB%E5%87%BB&rft.jtitle=%E4%B8%9C%E5%8D%8E%E5%A4%A7%E5%AD%A6%E5%AD%A6%E6%8A%A5%EF%BC%88%E8%8B%B1%E6%96%87%E7%89%88%EF%BC%89&rft.au=%E8%B5%B5%E9%B8%BF%E5%BF%97&rft.au=%E9%83%9D%E7%81%B5%E5%B9%BF&rft.au=%E9%83%9D%E7%9F%BF%E8%8D%A3&rft.au=%E9%9A%97%E5%85%B5&rft.date=2024&rft.pub=%E4%B8%9C%E5%8D%8E%E5%A4%A7%E5%AD%A6%E6%95%B0%E5%AD%97%E5%8C%96%E7%BA%BA%E7%BB%87%E6%9C%8D%E8%A3%85%E6%8A%80%E6%9C%AF%E6%95%99%E8%82%B2%E9%83%A8%E5%B7%A5%E7%A8%8B%E7%A0%94%E7%A9%B6%E4%B8%AD%E5%BF%83%2C%E4%B8%8A%E6%B5%B7+201620&rft.issn=1672-5220&rft.volume=41&rft.issue=5&rft.spage=545&rft.epage=556&rft_id=info:doi/10.19884%2Fj.1672-5220.202312003&rft.externalDocID=dhdxxb_e202405010
thumbnail_s http://utb.summon.serialssolutions.com/2.0.0/image/custom?url=http%3A%2F%2Fwww.wanfangdata.com.cn%2Fimages%2FPeriodicalImages%2Fdhdxxb-e%2Fdhdxxb-e.jpg