fakeWeather: Adversarial Attacks for Deep Neural Networks Emulating Weather Conditions on the Camera Lens of Autonomous Systems

Bibliographic Details
Published in: 2022 International Joint Conference on Neural Networks (IJCNN), pp. 1-9
Main Authors: Marchisio, Alberto; Caramia, Giovanni; Martina, Maurizio; Shafique, Muhammad
Format: Conference Proceeding
Language: English
Published: IEEE, 18.07.2022
Summary: Recently, Deep Neural Networks (DNNs) have achieved remarkable performance in many applications, while several studies have exposed their vulnerability to malicious attacks. In this paper, we emulate the effects of natural weather conditions to introduce plausible perturbations that mislead DNNs. By observing the effects of such atmospheric perturbations on the camera lens, we model the patterns to create different masks that fake the effects of rain, snow, and hail. Even though the perturbations introduced by our attacks are visible, their presence remains unnoticed due to their association with natural events, which can be especially catastrophic for fully-autonomous and unmanned vehicles. We test our proposed fakeWeather attacks on multiple Convolutional Neural Network and Capsule Network models, and report noticeable accuracy drops in the presence of such adversarial perturbations. Our work introduces a new security threat for DNNs, which is especially severe for safety-critical applications and autonomous systems.
ISSN:2161-4407
DOI:10.1109/IJCNN55064.2022.9892612
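The summary describes an attack that overlays weather-like occlusion masks on camera images before they reach the DNN. A minimal illustrative sketch of that idea is below; the streak-shaped "rain" mask, the blending scheme, and all parameter values (`n_drops`, `drop_len`, `intensity`) are assumptions for illustration, not the mask patterns the paper derives from real lens observations.

```python
import numpy as np

def rain_mask(height, width, n_drops=40, drop_len=6, seed=0):
    """Build a binary mask of short vertical streaks, a crude stand-in
    for raindrop traces on a camera lens (illustrative model only)."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((height, width), dtype=bool)
    for _ in range(n_drops):
        r = rng.integers(0, height - drop_len)  # streak top row
        c = rng.integers(0, width)              # streak column
        mask[r:r + drop_len, c] = True
    return mask

def apply_weather(image, mask, intensity=0.8):
    """Blend bright 'droplet' pixels into the image where the mask is set,
    leaving all other pixels untouched."""
    perturbed = image.astype(np.float32).copy()
    perturbed[mask] = (1 - intensity) * perturbed[mask] + intensity * 255.0
    return perturbed.astype(image.dtype)

# Example: perturb a dummy grayscale frame; a classifier would then be
# evaluated on `adv` instead of `img` to measure the accuracy drop.
img = np.full((32, 32), 100, dtype=np.uint8)
mask = rain_mask(32, 32)
adv = apply_weather(img, mask)
```

Only the masked pixels change, which mirrors the paper's point: the perturbation is visible yet plausible, since it resembles a natural weather artifact rather than adversarial noise.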