
Graph ARDMs

This is the official public repository for the two papers:

  • Discriminator Guidance for Autoregressive Diffusion Models, AISTATS 2024, PMLR
  • Autoregressive Diffusion Models with non-Uniform Generation Orders, ICML 2023 Workshop on Structured Probabilistic Inference and Generative Modeling, OpenReview

by Filip Ekström Kelvinius and Fredrik Lindsten.

Installation

See INSTALL.md for instructions on how to install the required packages.

Training

All parameters used in this work are either defaults or set in the respective config files.

If you use a config for training, its values are overridden by any values also set explicitly on the command line.
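
For example, assuming the config file sets a logger, the following invocation would override that value from the command line:

python main.py --config configs/qm9/qm9_ardm_uniform_config.json --logger none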

ARDM

A minimal working example for training a generative model is

python main.py --config configs/qm9/qm9_ardm_uniform_config.json --logger [none/wandb]

Generation of training set

After training a generator, a dataset for training the discriminator can be generated by running

python main.py --mode generate --load [path to wandb run directory of generative model]
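
For example (the run directory below is a hypothetical placeholder; use the directory wandb created for your generative model):

python main.py --mode generate --load wandb/run-20240101_120000-abc123  # hypothetical run directory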

Discriminator

To train a discriminator, a minimal command is

python main.py --config configs/qm9/qm9_discr_uniform_config.json --gen_data_path [path to generated training set] --logger [none/wandb]

The discriminator can be initialized with the weights of a pre-trained model by additionally passing --load [path to wandb run directory of ARDM] (this was done in all our experiments).
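
Putting the pieces together, a full discriminator-training command could look as follows (the data path and run directory are hypothetical placeholders):

python main.py --config configs/qm9/qm9_discr_uniform_config.json --gen_data_path generated/qm9_ardm_samples --load wandb/run-20240101_120000-abc123 --logger wandb  # hypothetical paths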

Evaluation

An ARDM can be evaluated by running

python main.py --mode evaluate --load [path to wandb run directory of ARDM]

To run with discriminator guidance, you additionally have to pass --load_discriminator [path to wandb run directory of discriminator] --guidance_mode [ardg/bsdg/fadg].
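
For example, a guided evaluation run could look like this (both run directories are hypothetical placeholders):

python main.py --mode evaluate --load wandb/run-20240101_120000-abc123 --load_discriminator wandb/run-20240102_090000-def456 --guidance_mode bsdg  # hypothetical run directories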

Options for the SMC methods that can be set (a sketch of the underlying mechanics follows the list):

  • --num_particles Number of particles per sample. With --num_particles -1, the number of particles equals the number of samples, and the samples become (highly) correlated
  • --ess_ratio Threshold (a value between 0 and 1) for when resampling is performed
  • --pf_sampling Type of resampler, i.e., multinomial, stratified, or systematic
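
As background for these options, the following is a minimal sketch of standard SMC resampling mechanics (illustrative only, not this repository's implementation; all function names are ours). Resampling is typically triggered when the effective sample size (ESS) drops below ess_ratio times the number of particles:

import numpy as np

def effective_sample_size(log_weights):
    # ESS = 1 / sum(w_i^2) for self-normalized weights w_i
    w = np.exp(log_weights - log_weights.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def systematic_resample(weights, rng):
    # One shared uniform offset, then evenly spaced points through the CDF
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)

def maybe_resample(particles, log_weights, ess_ratio, rng):
    # Resample only when the ESS falls below ess_ratio * num_particles
    n = len(particles)
    if effective_sample_size(log_weights) < ess_ratio * n:
        w = np.exp(log_weights - log_weights.max())
        w /= w.sum()
        idx = systematic_resample(w, rng)
        particles = [particles[i] for i in idx]
        log_weights = np.zeros(n)  # uniform weights after resampling
    return particles, log_weights

A multinomial resampler would instead draw indices i.i.d. from the weight distribution, and stratified resampling draws one uniform per stratum rather than a single shared offset.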

Citation

@InProceedings{ekstrom_kelvinius_discriminator_2024,
  title     = {Discriminator Guidance for Autoregressive Diffusion Models},
  author    = {Ekstr\"{o}m Kelvinius, Filip and Lindsten, Fredrik},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3403--3411},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/ekstrom-kelvinius24a/ekstrom-kelvinius24a.pdf},
  url       = {https://proceedings.mlr.press/v238/ekstrom-kelvinius24a.html},
  abstract  = {We introduce discriminator guidance in the setting of Autoregressive Diffusion Models. The use of a discriminator to guide a diffusion process has previously been used for continuous diffusion models, and in this work we derive ways of using a discriminator together with a pretrained generative model in the discrete case. First, we show that using an optimal discriminator will correct the pretrained model and enable exact sampling from the underlying data distribution. Second, to account for the realistic scenario of using a sub-optimal discriminator, we derive a sequential Monte Carlo algorithm which iteratively takes the predictions from the discriminator into account during the generation process. We test these approaches on the task of generating molecular graphs and show how the discriminator improves the generative performance over using only the pretrained model.}
}


@inproceedings{ekstrom_kelvinius_autoregressive_2023,
  title     = {Autoregressive {{Diffusion Models}} with Non-{{Uniform Generation Order}}},
  booktitle = {{{ICML}} 2023 {{Workshop}} on {{Structured Probabilistic Inference}} \& {{Generative Modeling}}},
  author    = {Ekstr{\"o}m Kelvinius, Filip and Lindsten, Fredrik},
  year      = {2023},
  address   = {Honolulu, HI, USA},
  month     = jul
}

Acknowledgements

Much of the code was adapted from the repository of DiGress, which is licensed under an MIT license (see the license text in the corresponding source files). Consider citing their paper:

@inproceedings{vignac2023digress,
  title     = {DiGress: Discrete Denoising diffusion for graph generation},
  author    = {Clement Vignac and Igor Krawczuk and Antoine Siraudin and Bohan Wang and Volkan Cevher and Pascal Frossard},
  booktitle = {The Eleventh International Conference on Learning Representations},
  year      = {2023},
  url       = {https://openreview.net/forum?id=UaAD-Nu86WX}
}
