This repository backs the results published at https://speed.fluxml.ai
It is a collection of benchmarking runs for a subset of the modeling done in the FluxML ecosystem, and also serves as a means of tracking performance over time.
To run the benchmarks locally:

- clone this repository
- `cd` into the local copy via `cd FluxBench.jl`
- open Julia and call `] instantiate`

And finally:

```julia
julia> using FluxBench

julia> FluxBench.bench()
```
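The steps above can also be run end to end from a shell; this is a sketch, assuming the repository lives under the FluxML GitHub organization:

```shell
# clone the repository and enter the local copy
git clone https://github.com/FluxML/FluxBench.jl
cd FluxBench.jl

# instantiate the project environment (equivalent to `] instantiate` in the REPL)
julia --project=. -e 'using Pkg; Pkg.instantiate()'

# run the benchmark suite
julia --project=. -e 'using FluxBench; FluxBench.bench()'
```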
To contribute benchmarks, one needs to:

- add the script(s) to the `src/packages` directory, with the required dependencies and the code needed to run the benchmarks
  - Note: remember to add a `group` to the `SUITE` variable via `addgroup!(SUITE, "name/of/benchmark/group")`
- treat `group` as a dictionary; new benchmarks can be added by assigning to it as `group["name_of_benchmark"] = @benchmarkable ...`
- please use the `@benchmarkable` macro to set up the benchmarks (see BenchmarkTools.jl for a reference)
- please follow the performance, profiling and benchmarking guides of the different packages used in the benchmark. Examples include Julia's, Flux's, CUDA's and BenchmarkTools'
- include the benchmarks in the top-level file `src/FluxBench.jl`
- call the benchmarks in the `bench` function located in `src/bench.jl`
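As a sketch, a new benchmark script under `src/packages` might look like the following. The group name and the workload are illustrative only; `SUITE` is the suite variable mentioned above, and `addgroup!` / `@benchmarkable` come from BenchmarkTools.jl:

```julia
using BenchmarkTools

# Add a new group to the top-level suite; treat the returned
# `group` as a dictionary of benchmarks.
group = addgroup!(SUITE, "MyPackage/matmul")

# Register a benchmark by assigning an `@benchmarkable` expression.
# The `setup` phase keeps allocation of the inputs out of the timing.
group["mul_64x64"] = @benchmarkable A * B setup = (A = rand(64, 64); B = rand(64, 64))
```

Note that `@benchmarkable` only constructs the benchmark; the suite itself decides when the benchmarks are tuned and run (here, via `FluxBench.bench()`).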