Implementation of a (autoregressive) transformer quantum state that can be pretrained with data from quantum simulators

HannahLange/HybridTransformer

Transformer neural networks and quantum simulators: a hybrid approach for simulating strongly correlated systems

This is the implementation accompanying our manuscript Lange et al., arXiv:2406.00091. The code contains parts adapted from Zhang et al., PRB 107 (2023) and Sprague et al., Comm. Phys. 7 (2024).

We provide a transformer quantum state implementation for the dipolar XY model, which can be pretrained on numerical or experimental data, e.g. from quantum simulators as in Chen et al., Nature 616 (2023). Our hybrid training procedure consists of two parts:

  1. A data-driven pretraining: We train on snapshots in the computational basis as well as on observables measured in other bases (here the X basis).
  2. An energy-driven training using variational Monte Carlo.
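The two phases above can be illustrated with a deliberately minimal sketch: a toy product-state autoregressive ansatz is first fit to snapshots by maximum likelihood (phase 1), then its variational energy is lowered with a stochastic VMC gradient step (phase 2). All names here (`ToyARModel`, `pretrain`, `vmc_step`) and the toy diagonal Hamiltonian are illustrative assumptions, not the repository's API or model.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyARModel:
    """Toy autoregressive ansatz: independent spins with P(s_i=1) = sigmoid(theta_i).
    Stands in for the transformer quantum state; not the repository's model."""
    def __init__(self, n):
        self.theta = np.zeros(n)

    def probs(self):
        return 1.0 / (1.0 + np.exp(-self.theta))

    def sample(self, m):
        return (rng.random((m, len(self.theta))) < self.probs()).astype(float)

# Phase 1: data-driven pretraining (maximum likelihood on snapshots).
def pretrain(model, snapshots, lr=0.5, steps=200):
    for _ in range(steps):
        p = model.probs()
        grad = np.mean(snapshots - p, axis=0)  # gradient of mean log-likelihood
        model.theta += lr * grad

# Phase 2: energy-driven VMC step for a toy diagonal Hamiltonian H = -sum_i s_i.
def local_energy(s):
    return -np.sum(s, axis=1)

def vmc_step(model, m=500, lr=0.1):
    s = model.sample(m)
    e = local_energy(s)
    dlogp = s - model.probs()                 # per-sample gradient of log p(s)
    grad = np.mean((e - e.mean())[:, None] * dlogp, axis=0)
    model.theta -= lr * grad                  # descend the variational energy

model = ToyARModel(4)
data = (rng.random((1000, 4)) < 0.8).astype(float)  # fake snapshots, <s_i> ~ 0.8
pretrain(model, data)
pre_probs = model.probs().copy()
print(pre_probs)          # close to the empirical snapshot means (~0.8)
for _ in range(200):
    vmc_step(model)
print(model.probs())      # energy training pushes the spins further toward 1
```

In the actual code the product ansatz is replaced by the patched transformer and the toy Hamiltonian by the dipolar XY model, but the two-phase structure is the same.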
The source code is provided in the folder src. It contains model.py and pos_encoding.py with the implementation of the autoregressive, patched transformer, which can be supplemented with spatial symmetries via symmetries.py, and localenergy.py with the implementation of the Hamiltonian. Furthermore, we provide the exemplary run files run_pretraining.py and run.py.
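To illustrate the generic idea behind the autoregressive, patched ansatz, the sketch below draws spins patch by patch: each patch of `PATCH` spins is sampled from a conditional distribution over its `2**PATCH` configurations given the previous patches, so the sampled probability is normalized by construction. The function `conditional_logits` is a hypothetical stand-in for the transformer, not the model in model.py.

```python
import numpy as np

rng = np.random.default_rng(1)
PATCH = 2        # spins per patch
N_PATCHES = 3    # 6 spins in total

def conditional_logits(prev_patches):
    # Stand-in for a causal transformer: logits over the 2**PATCH
    # configurations of the next patch, conditioned on the history.
    h = sum(prev_patches) if prev_patches else 0
    return np.array([0.0, 0.1 * h, -0.1 * h, 0.2])  # 2**PATCH = 4 entries

def sample_state():
    patches, log_prob = [], 0.0
    for _ in range(N_PATCHES):
        logits = conditional_logits(patches)
        p = np.exp(logits - logits.max())
        p /= p.sum()                       # softmax over patch configurations
        k = rng.choice(2**PATCH, p=p)
        log_prob += np.log(p[k])
        patches.append(k)
    # decode patch indices into individual spins (0/1)
    spins = [(k >> b) & 1 for k in patches for b in range(PATCH)]
    return np.array(spins), log_prob

s, lp = sample_state()
print(s, lp)    # a 6-spin configuration and its exact log-probability
```

Because each conditional is normalized, no Markov-chain sampling is needed: states come with exact probabilities, which is what makes both the likelihood-based pretraining and the VMC estimates straightforward.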
