RobustCheck: A Python package for black-box robustness assessment of image classifiers


Bibliographic Details
Published in: SoftwareX, Vol. 27, p. 101831
Main Authors: Ilie, Andrei; Stefanescu, Alin
Format: Journal Article
Language: English
Publisher: Elsevier B.V., 01.09.2024

Summary: The robustness of computer vision models against adversarial attacks is a critical matter in machine learning that is often overlooked by researchers and developers. A contributing factor to this oversight is the complexity involved in assessing model robustness. This paper introduces RobustCheck, a Python package designed for evaluating the adversarial robustness of computer vision models. Utilizing black-box adversarial techniques, it allows for the assessment of model resilience without internal model access, reflecting real-world application constraints. RobustCheck is distinctive for its rapid integration into development workflows and its efficiency in robustness testing. The tool provides an essential resource for developers to enhance the security and reliability of computer vision systems.
ISSN: 2352-7110
DOI: 10.1016/j.softx.2024.101831
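The record does not show RobustCheck's actual API. As a minimal sketch of the black-box setting the summary describes, the following illustrates how adversarial robustness can be estimated by querying a classifier only through its predictions, with no access to weights or gradients. All names here (`predict`, `robustness_score`, the toy linear model) are hypothetical illustrations, not RobustCheck's interface.

```python
import numpy as np

# Hypothetical stand-in classifier: a toy linear model exposed only
# through predict(), mimicking black-box access (no gradients, no weights
# visible to the assessment code).
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 3))  # 10 input features, 3 classes

def predict(x):
    """Black-box interface: input vector -> predicted class label."""
    return int(np.argmax(x @ W))

def robustness_score(predict_fn, samples, eps=0.1, queries=50, seed=0):
    """Fraction of samples whose label survives random L-inf perturbations.

    A simple query-based proxy for adversarial robustness: for each sample,
    draw `queries` random perturbations with entries in [-eps, eps] and
    check whether any of them flips the predicted label.
    """
    rng = np.random.default_rng(seed)
    robust = 0
    for x in samples:
        label = predict_fn(x)
        flipped = False
        for _ in range(queries):
            delta = rng.uniform(-eps, eps, size=x.shape)
            if predict_fn(x + delta) != label:
                flipped = True
                break
        robust += not flipped
    return robust / len(samples)

samples = rng.normal(size=(20, 10))
score = robustness_score(predict, samples, eps=0.05)
print(f"empirical robustness at eps=0.05: {score:.2f}")
```

Dedicated black-box attack libraries use far stronger query strategies (e.g. evolutionary search) than uniform random noise, but the overall shape is the same: a prediction-only interface, a perturbation budget, and a resulting robustness estimate.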