Transformer neural networks and quantum simulators: a hybrid approach for simulating strongly correlated systems
This is the implementation accompanying our manuscript Lange et al., arXiv:2406.00091. The code contains parts adapted from Zhang et al., PRB 107 (2023) and Sprague et al., Commun. Phys. 7 (2024).
Transformer quantum state implementation for the dipolar XY model, which can be pretrained on numerical or experimental data, e.g. from quantum simulators as in Chen et al., Nature 616 (2023). Our hybrid training procedure consists of two parts:
- Data-driven pretraining: we train on snapshots taken in the computational basis as well as on observables measured in bases other than the computational basis (here the X basis).
- Energy-driven training using variational Monte Carlo (VMC). A minimal sketch of both stages is given below.
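The following PyTorch sketch illustrates the two stages under simplifying assumptions: `ToyAutoregressiveTQS` is a hypothetical stand-in for the patched transformer, sampling of configurations from |psi|^2 is omitted, and the interfaces (`log_psi`, `pretraining_step`, `vmc_step`) are chosen for this example only; they do not reflect the actual code in `src/model.py`, `run_pretraining.py`, or `run.py`.

```python
# Minimal sketch of the hybrid training procedure (hypothetical interfaces).
import torch


class ToyAutoregressiveTQS(torch.nn.Module):
    """Stand-in for the patched transformer: a small MLP producing per-site
    conditional Bernoulli logits and a phase."""

    def __init__(self, n_sites: int, hidden: int = 32):
        super().__init__()
        self.n_sites = n_sites
        self.net = torch.nn.Sequential(
            torch.nn.Linear(n_sites, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, 2 * n_sites),  # logits and phases per site
        )

    def log_psi(self, s: torch.Tensor):
        """Return log|psi(s)| and the phase for spin configurations s in {0, 1}."""
        out = self.net(s.float())
        logits, phases = out[:, : self.n_sites], out[:, self.n_sites :]
        # Autoregressive factorisation: log p(s) is a sum of conditional terms.
        log_p = -torch.nn.functional.binary_cross_entropy_with_logits(
            logits, s.float(), reduction="none"
        ).sum(-1)
        return 0.5 * log_p, phases.sum(-1)  # |psi|^2 = p  =>  log|psi| = log(p)/2


def pretraining_step(model, snapshots, optimizer):
    """Data-driven stage: maximise the likelihood of measured Z-basis snapshots.
    (Observables measured in other bases, e.g. X, enter as additional loss terms.)"""
    log_amp, _ = model.log_psi(snapshots)
    loss = -2.0 * log_amp.mean()  # negative log-likelihood, since p = |psi|^2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


def vmc_step(model, samples, local_energy, optimizer):
    """Energy-driven stage: stochastic estimate of the energy gradient,
    grad E ~ 2 Re <(E_loc - <E_loc>) grad log psi*>, with samples from |psi|^2."""
    log_amp, phase = model.log_psi(samples)
    e_loc = local_energy(model, samples).detach().to(torch.cfloat)
    centred = e_loc - e_loc.mean()
    # log psi* = log|psi| - i*phase, hence the real/imaginary split below.
    loss = 2.0 * (centred.real * log_amp + centred.imag * phase).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return e_loc.mean().real.item()
```

In practice, the snapshots for the first stage come from numerics or from the quantum simulator, while the samples for the second stage are drawn autoregressively from the model itself.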
The source code is provided in the folder `src`. It contains `model.py` and `pos_encoding.py` with the implementation of the autoregressive, patched transformer, which can be supplemented with spatial symmetries using `symmetries.py`, as well as `localenergy.py` with the implementation of the Hamiltonian. Furthermore, we provide the example run scripts `run_pretraining.py` and `run.py`.
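As an illustration of what goes into `localenergy.py`, here is a hedged sketch of the local energy of the dipolar XY model, H = (J/2) Σ_{i<j} (σ^x_i σ^x_j + σ^y_i σ^y_j) / r_ij³, for an ansatz exposing the `log_psi` interface used in the sketch above. The function name, interface, and Hamiltonian conventions (sign of J, factor 1/2, lattice geometry) are assumptions made for this example and may differ from the repository code.

```python
# Hypothetical sketch of the local energy of the dipolar XY model; conventions
# may differ from src/localenergy.py.
import itertools

import torch


def dipolar_xy_local_energy(model, samples, positions, J=1.0):
    """E_loc(s) = sum_{s'} <s|H|s'> psi(s') / psi(s) for the dipolar XY model.

    The exchange term only connects configurations that differ by swapping one
    anti-aligned pair of spins, so the connected configurations are enumerated
    pair by pair.
    samples:   (batch, n_sites) tensor with entries in {0, 1}
    positions: (n_sites, 2) tensor of lattice coordinates
    """
    batch, n_sites = samples.shape
    log_amp, phase = model.log_psi(samples)
    log_psi_s = torch.complex(log_amp, phase)

    e_loc = torch.zeros(batch, dtype=torch.cfloat)
    for i, j in itertools.combinations(range(n_sites), 2):
        # Dipolar coupling falls off with the cube of the distance.
        coupling = J / torch.linalg.norm((positions[i] - positions[j]).float()) ** 3
        # The matrix element is nonzero only for anti-aligned pairs (01 or 10).
        connected = (samples[:, i] != samples[:, j]).to(torch.cfloat)
        flipped = samples.clone()
        flipped[:, [i, j]] = samples[:, [j, i]]  # swap the two spins
        log_amp_f, phase_f = model.log_psi(flipped)
        ratio = torch.exp(torch.complex(log_amp_f, phase_f) - log_psi_s)
        e_loc = e_loc + coupling * connected * ratio
    return e_loc
```

Such a function could, for instance, be handed to the `vmc_step` sketch above as `lambda m, s: dipolar_xy_local_energy(m, s, positions)`.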