---
language:
  - py
license:
  - mit
pretty_name: W3SA Solidity Access Control Benchmark
---

# W3SA - Solidity Access Control Benchmark

This benchmark includes 18 High Severity Access Control vulnerabilities derived from real-world contracts audited through Code4rena competitions. Credits to Zhuo for the initial data scraping and curation. We add a Python wrapper and a standardized evaluation framework for evaluating AI models.

## Project Statistics

The vulnerabilities come from 12 different projects; only the files containing the bugs are included in this benchmark.

| Number of Projects | Total Number of Files | Total Number of Bugs |
| --- | --- | --- |
| 12 | 16 | 18 |

## Repo Structure

The benchmark is contained within the `benchmark` directory and consists of:

- A list of folders (projects), each with one or more `.sol` files
- The `ground_truth.csv` file, which contains the labelled vulnerabilities for all projects (see the loading sketch after the tree below)
```
├── README.md
├── benchmark
│   ├── contracts/
│   └── ground_truth.csv
├── bm_src
│   ├── eval.py
│   ├── llm.py
│   ├── metrics.py
│   └── util.py
├── experiment.py
..
```
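
As a quick orientation, here is a minimal sketch of loading the ground truth file with pandas. The column names referenced below (e.g. `project`) are assumptions for illustration; inspect the CSV header for the actual schema.

```python
import pandas as pd

# Load the labelled vulnerabilities for all projects.
gt = pd.read_csv("benchmark/ground_truth.csv")
print(gt.head())

# Example: count labelled bugs per project.
# NOTE: 'project' is an assumed column name for illustration.
if "project" in gt.columns:
    print(gt["project"].value_counts())
```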

## Set up

- Install the `uv` package manager if it is not yet available
- Run `uv sync`

## Run an experiment

Launch your experiment by running:

```bash
uv run experiment.py --model o3-mini
```

A tqdm progress bar will appear and, at the end of the experiment, the result metrics will be printed.
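
For readers curious what the result metrics look like in a detection setting like this one, here is a hypothetical sketch of precision/recall/F1 over flagged findings. This is not the code in `bm_src/metrics.py`; the function, its inputs, and the matching scheme are assumptions for illustration.

```python
def detection_metrics(predicted: set, actual: set) -> dict:
    """Precision/recall/F1 over findings, where each finding is a
    hashable key such as (project, file).

    NOTE: illustrative only -- the real bm_src/metrics.py may use a
    different matching scheme (e.g. fuzzy line ranges).
    """
    tp = len(predicted & actual)   # correctly flagged vulnerabilities
    fp = len(predicted - actual)   # flagged, but not in ground truth
    fn = len(actual - predicted)   # real bugs the model missed

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}


# Toy usage with (project, file) keys:
print(detection_metrics(
    predicted={("proj_a", "Vault.sol"), ("proj_b", "Token.sol")},
    actual={("proj_a", "Vault.sol"), ("proj_c", "Pool.sol")},
))
```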

## Contact Us

For questions, suggestions, or to learn more about Almanax.ai, reach out to us at https://www.almanax.ai/contact