Type of Publication: Article in Collected Edition
Poster: Automated Evaluation of Fuzzers - Distinguished Technical Poster Award
- Sebastian Surminski; Michael Rodler; Lucas Davi
- Title of Anthology:
- Proc. of 26th Network and Distributed System Security Symposium (NDSS)
- Publication Date:
- Link to complete version:
Fuzzing is a well-known technique for automatically testing the robustness of software and its susceptibility to security-critical errors. Recently, many new and improved fuzzers have been presented. One critical aspect of any new fuzzer is its overall performance. However, given that no standardized fuzzing evaluation methodology exists, we observe significant discrepancies in evaluation results, making it highly challenging to compare fuzzing techniques.
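The core idea of fuzzing described above can be illustrated with a minimal sketch: repeatedly mutate a seed input and feed it to a target, collecting inputs that trigger a crash. The target `parse_record`, the `mutate` strategy, and all names here are hypothetical toy examples, not part of FETA or any real fuzzer:

```python
import random

def parse_record(data: bytes) -> None:
    """Toy target: raises on a malformed length field (the planted bug)."""
    if len(data) < 2:
        return
    length = data[0]
    payload = data[1:]
    # Bug: trusts the length field without bounds checking.
    assert length <= len(payload), "length field exceeds payload"

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Flip a few random bytes of the seed input."""
    data = bytearray(seed)
    for _ in range(rng.randint(1, 3)):
        pos = rng.randrange(len(data))
        data[pos] = rng.randrange(256)
    return bytes(data)

def fuzz(target, seed: bytes, iterations: int, rng_seed: int = 0) -> list:
    """Run the target on mutated inputs, collecting crashing inputs."""
    rng = random.Random(rng_seed)
    crashes = []
    for _ in range(iterations):
        candidate = mutate(seed, rng)
        try:
            target(candidate)
        except AssertionError:
            crashes.append(candidate)
    return crashes

crashes = fuzz(parse_record, seed=b"\x02ab", iterations=1000)
print(f"found {len(crashes)} crashing inputs")
```

Real fuzzers add coverage feedback, corpus management, and crash deduplication on top of this basic mutate-and-run loop, which is exactly where their performance differences arise.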
To tackle this deficiency, we developed a new framework, called FETA, which automatically evaluates fuzzers on a fixed and comprehensive test set, enabling an objective and general comparison of performance results. We apply FETA to various recently released academic and non-academic fuzzers, resulting in a large-scale evaluation of current state-of-the-art fuzzing approaches.
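The abstract does not describe FETA's internals, but the general principle of fixed-benchmark comparison it names can be sketched as follows: run each fuzzer repeatedly on the same benchmark and aggregate a robust statistic (here, the median number of bugs found) so that results are directly comparable. The fuzzer names, the simulated detection probabilities, and the `run_trial`/`evaluate` helpers are all hypothetical placeholders for real fuzzing campaigns:

```python
import random
from statistics import median

def run_trial(detect_prob: float, rng: random.Random, planted_bugs: int = 20) -> int:
    """Simulated campaign: each planted bug is found with probability detect_prob."""
    return sum(rng.random() < detect_prob for _ in range(planted_bugs))

def evaluate(fuzzers: dict, trials: int = 30, seed: int = 0) -> dict:
    """Run each fuzzer `trials` times on the same benchmark; report median bugs found."""
    rng = random.Random(seed)
    return {
        name: median(run_trial(p, rng) for _ in range(trials))
        for name, p in fuzzers.items()
    }

# Two hypothetical fuzzers with different (simulated) bug-finding ability.
scores = evaluate({"fuzzer_a": 0.6, "fuzzer_b": 0.4})
print(scores)
```

Repeated trials and a robust aggregate matter because single fuzzing runs are highly randomized; without them, two evaluations of the same fuzzer can disagree, which is precisely the discrepancy the abstract points out.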