Track 1: Classical Shadows with Transformer Architecture
This project implements a machine learning model that reconstructs a quantum density matrix from classical shadow measurement data using a transformer architecture.
- /src: Contains core source code for the model (model.py), data generation (data_gen.py), and training loops (train.py).
- /outputs: Stores saved model weights (model_weights.pt) and training logs.
- /docs: Detailed technical documentation and replication guides.
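For illustration, below is a minimal, hypothetical sketch of what a transformer-based ShadowReconstructor in model.py could look like: snapshot features are encoded by a transformer, pooled, and mapped to a lower-triangular Cholesky factor so the output is always a valid density matrix. The layer sizes, the snapshot feature encoding, and the output head are assumptions, not the project's actual implementation.

```python
import torch
import torch.nn as nn


class ShadowReconstructor(nn.Module):
    """Maps a sequence of classical-shadow snapshot features to a density matrix.

    Hypothetical sketch: dimensions and feature encoding are assumed, not taken
    from the project's model.py.
    """

    def __init__(self, n_qubits: int = 2, snap_dim: int = 16,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.dim = 2 ** n_qubits                        # Hilbert-space dimension
        self.embed = nn.Linear(snap_dim, d_model)       # per-snapshot embedding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Predict real and imaginary parts of a lower-triangular factor L.
        self.head = nn.Linear(d_model, 2 * self.dim * self.dim)

    def forward(self, snapshots: torch.Tensor) -> torch.Tensor:
        # snapshots: (batch, n_snapshots, snap_dim) classical-shadow features
        h = self.encoder(self.embed(snapshots)).mean(dim=1)      # pool over snapshots
        parts = self.head(h).view(-1, 2, self.dim, self.dim)
        L = torch.tril(torch.complex(parts[:, 0], parts[:, 1]))  # lower-triangular
        rho = L @ L.conj().transpose(-2, -1)                      # rho = L L^dagger (PSD)
        trace = torch.diagonal(rho, dim1=-2, dim2=-1).sum(-1).real
        return rho / trace.clamp_min(1e-12).view(-1, 1, 1)        # enforce tr(rho) = 1
```

The Cholesky-style parameterization $\rho = LL^{\dagger}$ (normalized to unit trace) guarantees the reconstructed matrix is Hermitian and positive semidefinite by construction, which is why it is a common choice for learned tomography models.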
Final trained model | Loss: 0.0030 | Fidelity: 0.9872 | TraceDist: 0.0770 | Latency: 5.63 ms
The trained model is saved to outputs/model_weights.pt; the full training log and metrics can be viewed there.
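As a usage illustration only, the snippet below shows one hypothetical way to load the saved weights and time a single reconstruction, assuming the ShadowReconstructor sketched above, that model_weights.pt stores a state_dict, and an input shaped (batch, n_snapshots, snap_dim):

```python
import time
import torch

# Hypothetical loading step; constructor arguments must match the saved weights.
model = ShadowReconstructor()
model.load_state_dict(torch.load("outputs/model_weights.pt", map_location="cpu"))
model.eval()

dummy_shadows = torch.randn(1, 100, 16)        # one batch of 100 snapshot feature vectors
with torch.no_grad():
    start = time.perf_counter()
    rho = model(dummy_shadows)
    latency_ms = (time.perf_counter() - start) * 1e3

print(f"Reconstructed a {rho.shape[-1]}x{rho.shape[-1]} density matrix "
      f"in {latency_ms:.2f} ms")
```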
In compliance with the QCG PaAC Open Project guidelines:
- Tools Used: Google Gemini (https://gemini.google.com/share/82a098e65b9e)
- Usage:
  - Generated PyTorch boilerplate for the ShadowReconstructor class.
  - Debugged tensor shape mismatches in the Cholesky decomposition.
- Verification:
  - Math verified against standard quantum mechanics textbooks ($\rho = LL^{\dagger}$).
  - Fidelity metric cross-referenced with standard library implementations.
- Final results: a fidelity of 98.72% is high by quantum state tomography (QST) standards, and the trace distance of 0.077 is correspondingly low, corroborating the fidelity score. The loss drops rapidly from 0.0817 to 0.0030 and then stabilizes, suggesting the model has converged. A per-state latency of 5-6 ms means the trained model reconstructs states quickly. (The two metrics are sketched below.)
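For reference, here is a hedged sketch of the two verification metrics using their standard definitions, Uhlmann fidelity $F(\rho,\sigma) = \left(\mathrm{tr}\sqrt{\sqrt{\rho}\,\sigma\sqrt{\rho}}\right)^2$ and trace distance $T(\rho,\sigma) = \tfrac{1}{2}\lVert\rho-\sigma\rVert_1$. This is not necessarily the project's own implementation; it assumes Hermitian, positive semidefinite inputs.

```python
import torch


def _sqrtm_psd(m: torch.Tensor) -> torch.Tensor:
    """Matrix square root of a Hermitian PSD matrix via eigendecomposition."""
    vals, vecs = torch.linalg.eigh(m)
    vals = vals.clamp_min(0).sqrt().to(m.dtype)
    return vecs @ torch.diag_embed(vals) @ vecs.conj().transpose(-2, -1)


def fidelity(rho: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
    """Uhlmann fidelity F(rho, sigma) = (tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    sqrt_rho = _sqrtm_psd(rho)
    inner = _sqrtm_psd(sqrt_rho @ sigma @ sqrt_rho)
    return torch.diagonal(inner, dim1=-2, dim2=-1).sum(-1).real ** 2


def trace_distance(rho: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
    """Trace distance T(rho, sigma) = 0.5 * sum of |eigenvalues of rho - sigma|."""
    eigvals = torch.linalg.eigvalsh(rho - sigma)   # the difference is Hermitian
    return 0.5 * eigvals.abs().sum(-1)
```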