AtomBench: A Benchmark for Generative Atomic Structure Models using GPT, Diffusion, and Flow Architectures https://arxiv.org/abs/2510.16165

AtomBench

The rapid development of generative AI models for materials discovery has created an urgent need for standardized benchmarks to evaluate their performance. In this work, we present **AtomBench**, a systematic benchmarking framework that comparatively evaluates three representative generative architectures for inverse crystal structure design: AtomGPT (transformer-based), CDVAE (diffusion variational autoencoder), and FlowMM (Riemannian flow matching). We train and evaluate these models on two high-quality DFT superconductivity datasets, JARVIS Supercon-3D and Alexandria DS-A/B, comprising over 9,000 structures with computed electron-phonon coupling properties.

Install dependencies

  1. Install models as submodules:
     git submodule update --init --recursive
  2. Install mamba to speed up conda environment creation:
     conda install -n base -c conda-forge mamba
  3. Install base Python dependencies:
     pip install uv dvc snakemake
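A quick sanity check, assuming a POSIX shell, that the CLIs from the steps above landed on PATH before moving on:

```shell
# Report any tool from the install steps that is not yet on PATH.
for tool in git conda snakemake dvc uv; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```

An empty output means every tool resolved; otherwise, rerun the corresponding install step.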

Compute benchmarks

  1. Edit scripts/absolute_path.sh and set it to the absolute path of this repository
  2. Run the following command to automatically recompute benchmarks, metrics, and figures:
snakemake all --cores all
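The exact contents of scripts/absolute_path.sh are repository-specific; as a hypothetical sketch of step 1 (the variable name REPO_PATH is an assumption, not taken from the repository), populating it from the repository root might look like:

```shell
# Hypothetical: write this repo's absolute path into scripts/absolute_path.sh.
# REPO_PATH is an assumed variable name; use whatever name the script expects.
mkdir -p scripts
echo "export REPO_PATH=\"$(pwd)\"" > scripts/absolute_path.sh
```

With the path populated, `snakemake all --cores all` rebuilds everything; Snakemake's standard `--dry-run` flag can preview the scheduled jobs before committing to a full run.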

Installation & Usage Tutorials
