RapidFuzz: Accelerating fuzzing via Generative Adversarial Networks

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 460, pp. 195-204
Main Authors: Ye, Aoshuang; Wang, Lina; Zhao, Lei; Ke, Jianpeng; Wang, Wenqi; Liu, Qinliang
Format: Journal Article
Language: English
Published: Elsevier B.V., 14.10.2021
Summary: We implement a Generative Adversarial Network (GAN) based fuzzer called RapidFuzz that generates synthetic testcases, capturing data-structure features of the input format in a shorter time than state-of-the-art fuzzers. RapidFuzz provides potential seeds generated by the GAN; i.e., the generated seeds, which follow similar but not identical numerical distributions, accelerate the mutation process. An algorithm is carefully designed to locate the hot-points in the GAN-generated seeds. The generated testcases make structural features easier to identify, which speeds up the whole fuzzing process. In our experiments, RapidFuzz considerably improves the performance of American Fuzzy Lop (AFL) in speed, coverage, and map size. We select nine open-source programs with different highly structured inputs to demonstrate the effectiveness of RapidFuzz. As a result, code coverage is significantly improved; for tiff2pdf and tiffdump, the coverage increase exceeds 20%. We also observe that RapidFuzz reaches the same coverage as AFL in less time. Furthermore, AFL absorbs 21% of the generated seed files for tiff2pdf, with an average absorption rate of around 15% for the other programs.
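The record's abstract only outlines the approach at a high level: train a GAN on an existing seed corpus so its generator emits testcases with a similar but not identical numerical distribution, then hand those testcases to AFL as additional seeds. The sketch below illustrates that general idea only; the network architecture, dimensions, training settings, and the way seeds are handed to AFL are all illustrative assumptions, not the paper's actual RapidFuzz implementation.

```python
# Minimal sketch of GAN-based seed generation, assuming a fixed-length byte
# representation of testcases. All names and hyperparameters are hypothetical.
import torch
import torch.nn as nn

SEED_LEN = 256   # assumed fixed testcase length in bytes
NOISE_DIM = 64   # assumed latent dimension

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, 256), nn.ReLU(),
            nn.Linear(256, SEED_LEN), nn.Sigmoid(),  # outputs in [0, 1] ~ byte / 255
        )
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SEED_LEN, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1), nn.Sigmoid(),
        )
    def forward(self, x):
        return self.net(x)

def train(real_seeds, epochs=200):
    """real_seeds: float tensor of shape (N, SEED_LEN), values scaled to [0, 1]."""
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCELoss()
    n = real_seeds.size(0)
    for _ in range(epochs):
        # Discriminator step: distinguish real corpus seeds from generated ones.
        fake = G(torch.randn(n, NOISE_DIM)).detach()
        d_loss = bce(D(real_seeds), torch.ones(n, 1)) + bce(D(fake), torch.zeros(n, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
        # Generator step: produce samples the discriminator accepts as real.
        g_loss = bce(D(G(torch.randn(n, NOISE_DIM))), torch.ones(n, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return G

def emit_seeds(G, count, out_dir="gan_seeds"):
    """Write generated samples as raw files to a directory that could be
    offered to AFL as extra seed inputs (an assumed integration, not the
    paper's described workflow)."""
    import os
    os.makedirs(out_dir, exist_ok=True)
    with torch.no_grad():
        samples = G(torch.randn(count, NOISE_DIM))
    for i, s in enumerate(samples):
        with open(os.path.join(out_dir, f"seed_{i:04d}"), "wb") as f:
            f.write((s * 255).to(torch.uint8).numpy().tobytes())
```

Under these assumptions, the generator learns the corpus's byte-level distribution, so its outputs resemble valid highly structured inputs while still varying numerically, which is the property the abstract credits with accelerating AFL's mutation phase.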
ISSN: 0925-2312
EISSN: 1872-8286
DOI: 10.1016/j.neucom.2021.06.082