# Mowst: Mixture of Weak and Strong Experts on Graphs

Hanqing Zeng* (zengh@meta.com), Hanjia Lyu* (hlyu5@ur.rochester.edu), Diyi Hu, Yinglong Xia, Jiebo Luo

*: equal contribution

Paper

## Example commands

1. Run vanilla GCN on Flickr (single run):

   ```
   python main.py --dataset flickr --method baseline --model2 GCN
   ```

2. Run Mowst*-GCN on ogbn-products (single run); the gating module's input features contain only the dispersion:

   ```
   python main.py --dataset product --method mowst_star --model2 GCN --original_data false
   ```

3. Run Mowst-Sage on Pokec (single run); the gating module's input features contain both the dispersion and the node self-features:

   ```
   python main.py --dataset pokec --method mowst --model2 Sage --original_data true
   ```

4. Run Mowst-Sage on Penn94 (grid search, 10 runs); the gating module's input features contain both the dispersion and the node self-features:

   ```
   python main.py --dataset penn94 --method mowst --model2 Sage --original_data true --setting ten
   ```
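The commands above toggle what the gating module sees (dispersion only, or dispersion plus node self-features). As a rough illustration of the dispersion-gated mixture idea, here is a minimal, hypothetical sketch in plain Python: the function names (`mowst_mix`, `dispersion`), the exponential gate, and the `temperature` parameter are illustrative assumptions, not the repository's actual implementation.

```python
import math

def softmax(logits):
    """Convert a list of logits to a probability distribution."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def dispersion(p):
    """Variance of a predicted class distribution.

    A near-uniform (unconfident) prediction has low dispersion;
    a peaked (confident) prediction has high dispersion.
    """
    mean = sum(p) / len(p)
    return sum((v - mean) ** 2 for v in p) / len(p)

def mowst_mix(weak_logits, strong_logits, temperature=10.0):
    """Hypothetical convex mix of weak/strong expert predictions,
    gated by the dispersion of the weak expert's output.

    When the weak expert is confident (high dispersion), the gate g
    leans toward its prediction; otherwise the strong expert dominates.
    """
    p_weak = softmax(weak_logits)
    p_strong = softmax(strong_logits)
    g = 1.0 - math.exp(-temperature * dispersion(p_weak))  # gate in (0, 1)
    return [g * w + (1.0 - g) * s for w, s in zip(p_weak, p_strong)]
```

In this sketch the output is always a valid distribution (a convex combination of two distributions), and a confident weak expert pulls the mixture toward its own prediction. The actual Mowst gating module is a learned network; see the paper for the precise formulation.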

## License

Mowst is MIT licensed, as found in the LICENSE file.

## Citation

```bibtex
@inproceedings{mowstgnn-iclr24,
  title={Mixture of Weak and Strong Experts on Graphs},
  author={Hanqing Zeng and Hanjia Lyu and Diyi Hu and Yinglong Xia and Jiebo Luo},
  booktitle={International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=wYvuY60SdD}
}
```