This README.md contains a set of checklists for our contest collaboration.
Your contest will use two repos:
- a contest repo (this one), which is used for scoping your contest and for providing information to contestants (wardens)
- a findings repo, where issues are submitted (shared with you after the contest)
Ultimately, when we launch the contest, this contest repo will be made public and will contain the smart contracts to be reviewed and all the information needed for contest participants. The findings repo will be made public after the contest report is published and your team has mitigated the identified issues.
Some of the checklists in this doc are for C4 (🐺) and some of them are for you as the contest sponsor (⭐️).
- Create a PR to this repo with the below changes:
- Provide a self-contained repository with working commands that will build (at least) all in-scope contracts, and commands that will run tests producing gas reports for the relevant contracts.
- Make sure your code is thoroughly commented using the NatSpec format.
- Please have final versions of contracts and documentation added/updated in this repo no less than 24 hours prior to contest start time.
- Be prepared for a 🚨code freeze🚨 for the duration of the contest — important because it establishes a level playing field. We want to ensure everyone's looking at the same code, no matter when they look during the contest. (Note: this includes your own repo, since a PR can leak alpha to our wardens!)
Under the "SPONSORS ADD INFO HERE" heading below, include the following:
- Modify the bottom of this README.md file to describe how your code is supposed to work, with links to any relevant documentation and any other criteria/details that the C4 Wardens should keep in mind when reviewing. (Here's a well-constructed example.)
- When linking, please provide all links as full absolute links rather than relative links
- All information should be provided in markdown format (HTML does not render on Code4rena.com)
- Under the "Scope" heading, provide the name of each contract and:
- source lines of code (excluding blank lines and comments) in each
- external contracts called in each
- libraries used in each
- Describe any novel or unique curve logic or mathematical models implemented in the contracts
- Does the token conform to the ERC-20 standard? In what specific ways does it differ?
- Describe anything else that adds any special logic that makes your approach unique
- Identify any areas of specific concern in reviewing the code
- Optional / nice to have: pre-record a high-level overview of your protocol (not just specific smart contract functions). This saves wardens a lot of time wading through documentation.
- See also: this checklist in Notion
- Delete this checklist and all text above the line below when you're ready.
- Total Prize Pool: $90,500 USDC
- HM awards: $63,750 USDC (Notion Field: Main Pool)
- QA report awards: $7,500 USDC (Notion Field: QA Pool, usually 10% of total award pool)
- Gas report awards: $3,750 USDC (Notion Field: Gas Pool, usually 5% of total award pool)
- Judge + presort awards: $15,000 USDC (Notion Field: Judge Fee)
- Scout awards: $500 USDC (this field doesn't exist in Notion yet, usually $500 USDC)
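As a quick sanity check, the award sub-pools listed above sum to the stated total prize pool:

```shell
# Sanity check: HM + QA + Gas + Judge/presort + Scout = total prize pool.
echo $((63750 + 7500 + 3750 + 15000 + 500))  # prints 90500
```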
- Join C4 Discord to register
- Submit findings using the C4 form
- Read our guidelines for more details
- Starts January 31, 2023 20:00 UTC
- Ends February 07, 2023 20:00 UTC
Automated findings output for the contest can be found [here](add link to report) within an hour of contest opening.
Note for C4 wardens: Anything included in the automated findings output is considered a publicly known issue and is ineligible for awards.
[ ⭐️ SPONSORS ADD INFO HERE ]
Please provide some context about the code being audited, and identify any areas of specific concern in reviewing the code. (This is a good place to link to your docs, if you have them.)
List all files in scope in the table below (along with hyperlinks) -- and feel free to add notes here to emphasize areas of focus.
For line-of-code counts, we recommend using `cloc`.
| Contract | SLOC | Purpose | Libraries used |
|---|---|---|---|
| contracts/folder/sample.sol | 123 | This contract does XYZ | @openzeppelin/* |
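As a rough alternative to `cloc`, a one-line filter can approximate SLOC by dropping blank lines and `//` comments (it does not handle `/* */` blocks, so prefer `cloc` for the final numbers). The sample contract below is hypothetical:

```shell
# Create a hypothetical sample contract to illustrate the count.
cat > /tmp/sample.sol <<'EOF'
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract Sample {
    uint256 public x; // state variable
}
EOF
# Count lines that are neither blank nor pure // comments.
grep -vcE '^[[:space:]]*(//|$)' /tmp/sample.sol  # prints 4
```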
List any files/contracts that are out of scope for this audit.
Describe any novel or unique curve logic or mathematical models implemented in the contracts
Sponsor, please confirm/edit the information below.
- If you have a public code repo, please share it here:
- How many contracts are in scope?:
- Total SLoC for these contracts?:
- How many external imports are there?:
- How many separate interfaces and struct definitions are there for the contracts within scope?:
- Does most of your code generally use composition or inheritance?:
- How many external calls?:
- What is the overall line coverage percentage provided by your tests?:
- Is there a need to understand a separate part of the codebase / get context in order to audit this part of the protocol?:
- Please describe required context:
- Does it use an oracle?:
- Does the token conform to the ERC-20 standard?:
- Are there any novel or unique curve logic or mathematical models?:
- Does it use a timelock function?:
- Is it an NFT?:
- Does it have an AMM?:
- Is it a fork of a popular project?:
- Does it use rollups?:
- Is it multi-chain?:
- Does it use a side-chain?:
Provide every step required to build the project from a fresh git clone, as well as steps to run the tests with a gas report.
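For example, a minimal Makefile can give wardens one-command entry points. This is a sketch assuming a Foundry toolchain; the targets are placeholders, so adjust the recipes for Hardhat or whatever stack your project actually uses:

```make
# Hypothetical sketch assuming Foundry; adjust commands for your toolchain.
install:        ## from a fresh clone: fetch dependencies
	forge install

build:          ## compile all in-scope contracts
	forge build

test:           ## run the test suite
	forge test

gas-report:     ## run tests and print per-function gas usage
	forge test --gas-report
```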
Note: many wardens run Slither as a first pass for testing. Please document any known errors that have no workaround.