acorn421/MaxAFL_public


MaxAFL

Overview

  • Maximize code coverage using an optimization algorithm.

Main Idea

  • Construct a global objective function for optimization based on static code analysis.
  • Implement a fast optimization algorithm using gradient descent.
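To illustrate the second point, here is a minimal sketch of gradient descent over an input value. The objective function below is a made-up stand-in; MaxAFL derives its actual objective from static code analysis, which is not reproduced here.

```python
# Sketch: minimize a toy objective with basic gradient descent.
# The quadratic objective is hypothetical; the real objective is
# constructed from static analysis of branch conditions.

def objective(x):
    # Hypothetical branch-distance-style objective, minimized at x = 3.
    return (x - 3.0) ** 2

def numerical_grad(f, x, eps=1e-6):
    # Central-difference approximation of df/dx, usable when the
    # objective is not analytically differentiable.
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def gradient_descent(f, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * numerical_grad(f, x)
    return x

x_min = gradient_descent(objective, x0=10.0)
```

The numerical gradient stands in for whatever gradient estimate the fuzzer uses; the loop itself is the standard descent update.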

Implementation

  • Implement instrumentation for fuzzing using an LLVM Pass.

    Dev environment

    • Ubuntu 18.04
    • Visual Studio Code

    Dependencies

    • LLVM 8.0.0
    • Boost Graph Library 1.6.2

Test

LLVM Pass Test

  • clang -O0 -S -emit-llvm sample.c -o sample.ll
  • opt -load (path_to_so_file)/FuncBlockCount.so -funcblockcount sample.ll

Paper Work

Related Works

  1. Angora
    • Uses gradient descent to explore paths.
    • Uses taint analysis and type inference for efficient gradient descent.
    • pdf, github
  2. NEUZZ
    • Trains a Deep Neural Network that predicts branch coverage (TBA).
    • pdf, github
  3. GRsan (Proximal gradient analysis)
    • Uses the chain rule to compute gradients of program variables w.r.t. the input.
    • pdf; source code is not available yet.

Links

  1. Fuzzing corpus
  2. Fuzzing targets

Analyze Results

  • Run the scripts in order: gen_cov.py -> coverage.py -> plot.py
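As a sketch of what the aggregation step in this pipeline might do, the snippet below computes cumulative edge coverage over time. The data format (timestamped sets of covered edges) is an assumption for illustration; the actual scripts live in the repository.

```python
# Sketch of a coverage-aggregation step: given per-run coverage
# observations sorted by time, compute the cumulative number of
# distinct edges covered so far at each point. The input format
# here is hypothetical.

def cumulative_coverage(runs):
    # runs: list of (timestamp, set_of_covered_edges), sorted by timestamp.
    seen = set()
    points = []
    for t, edges in runs:
        seen |= edges
        points.append((t, len(seen)))
    return points

points = cumulative_coverage([
    (0, {1, 2}),
    (1, {2, 3}),
    (2, {4}),
])
```

The resulting (timestamp, count) pairs are the kind of series a plotting script like plot.py would draw.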