Code for "Weisfeiler and Leman go sparse: Towards scalable higher-order graph embeddings" (NeurIPS 2020).
Requirements:
- Python 3.8
- eigen3
- numpy
- pandas
- scipy
- sklearn
- torch 1.5
- torch-geometric 1.5
- pybind11
- libsvm
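Assuming a standard pip-based setup, the Python-side dependencies above could be captured in a `requirements.txt` along these lines (pins other than torch 1.5 and torch-geometric 1.5 are not specified by the repository, and `sklearn` installs under the pip name `scikit-learn`; eigen3, the pybind11 headers, and libsvm are obtained separately):

```
numpy
pandas
scipy
scikit-learn
torch==1.5.0
torch-geometric==1.5.0
pybind11
```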
All results in the paper and the appendix can be reproduced by following the steps below.
Reproduce the kernel results (using the precomputed Gram matrices):

- `cd kernels`
- Download the datasets from www.graphlearning.io and place the unzipped folders into `kernels/datasets`
- Download https://www.chrsmrrs.com/wl_goes_sparse_matrices/EXP.zip and https://www.chrsmrrs.com/wl_goes_sparse_matrices/EXPSPARSE.zip, and unzip them into `kernels/svm/GM`
- `cd svm`
- Run `python svm.py`
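`svm.py` runs support vector classification on the kernel (Gram) matrices. As a minimal sketch of the underlying mechanism, here is scikit-learn's `SVC` with a precomputed kernel on toy data; a linear kernel stands in for the WL-based Gram matrices, and this is an illustration, not the repository's script:

```python
import numpy as np
from sklearn.svm import SVC

# Toy 1-D feature vectors standing in for graph feature maps; in the repository
# the precomputed WL kernel matrices are loaded from kernels/svm/GM instead.
X = np.array([[0.0], [0.2], [2.0], [2.2]])
y = np.array([0, 0, 1, 1])

gram = X @ X.T  # linear kernel as a stand-in for a WL kernel matrix

clf = SVC(kernel="precomputed", C=1.0)
clf.fit(gram, y)

# Prediction needs the kernel values between test and training examples;
# here we simply reuse the training Gram matrix.
preds = clf.predict(gram)
print(preds.shape)
```

With `kernel="precomputed"`, `fit` expects an n×n kernel matrix over the training examples rather than a feature matrix, which is exactly the setting in which precomputed Gram matrices are useful.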
Reproduce the kernel results (computing the Gram matrices from scratch):

- `cd kernels`
- Download the datasets from www.graphlearning.io and place the unzipped folders into `kernels/datasets`
- Run `g++ main.cpp src/*cpp -std=c++11 -o local -O2`
- Run `./local` (running times are also printed to the screen)
- `cd svm`
- Run `python svm.py`
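The compiled `./local` program computes the sparse WL-type kernels. For orientation, the following is a minimal sketch of classic 1-WL colour refinement, the procedure these kernels build on; it is a simplified illustration in Python, not the paper's local δ-k algorithm or the C++ implementation:

```python
def wl_refine(adj, labels, iterations=3):
    """1-dimensional Weisfeiler-Leman colour refinement (sketch).

    adj: dict mapping node -> iterable of neighbours
    labels: dict mapping node -> initial colour
    Returns the colouring after `iterations` refinement rounds.
    """
    colors = dict(labels)
    for _ in range(iterations):
        # New signature = own colour plus the sorted multiset of neighbour colours.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures into small integer colours.
        palette = {}
        for v in sorted(signatures):
            sig = signatures[v]
            if sig not in palette:
                palette[sig] = len(palette)
        colors = {v: palette[signatures[v]] for v in adj}
    return colors

# Path on 3 nodes: the endpoints stay the same colour, the centre differs.
adj = {0: [1], 1: [0, 2], 2: [1]}
colors = wl_refine(adj, {v: 0 for v in adj})
print(colors)
```

A WL kernel then compares two graphs by counting how often each colour occurs in each graph across the refinement rounds.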
Reproduce the neural baseline results:

- `cd neural baselines`
- Run `python main_gnn.py`
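`main_gnn.py` trains standard GNN baselines. As a rough NumPy sketch of the sum-aggregation message-passing step at the core of such models (random weights, GCN-style self-loops; not the script's actual architecture):

```python
import numpy as np

# Adjacency matrix of a triangle graph and random node features.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.standard_normal((3, 4))   # 3 nodes, 4 input features
W = rng.standard_normal((4, 2))   # weight matrix (learned in practice, random here)

# One message-passing layer: add self-loops, aggregate neighbour features,
# apply a linear transform, then a ReLU non-linearity.
H_next = np.maximum((A + np.eye(3)) @ H @ W, 0.0)
print(H_next.shape)  # (3, 2)
```

Stacking several such layers and pooling the node features into a single graph vector yields a graph-level prediction, which is the pattern the baselines follow.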
Reproduce the higher-order neural results: you first need to build the Python preprocessing package:

- `cd neural_higher_order/preprocessing`
- You might need to adjust the path to pybind11 in `preprocessing.cpp`; then run
  - macOS: `c++ -O3 -shared -std=c++11 -undefined dynamic_lookup $(python3 -m pybind11 --includes) preprocessing.cpp src/*cpp -o ../preprocessing$(python3-config --extension-suffix)`
  - Linux: `c++ -O3 -shared -std=c++11 -fPIC $(python3 -m pybind11 --includes) preprocessing.cpp src/*cpp -o ../preprocessing$(python3-config --extension-suffix)`
- Run the Python scripts in `Alchemy`, `QM9`, and `ZINC` to reproduce the scores and running times
- For example: `cd Alchemy`, then `python local_2_FULL.py` to reproduce the scores for the δ-2-LGNN on the `Alchemy` dataset
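The local higher-order architectures such as the δ-2-LGNN operate on 2-tuples of nodes and pass messages only between tuples that differ in one component connected by an edge. A much-simplified Python sketch of enumerating 2-tuples and their local neighbourhoods (label and isomorphism-type information from the paper's construction is omitted):

```python
from itertools import product

def local_two_tuple_graph(adj):
    """Build the local neighbourhood structure on 2-tuples (sketch).

    adj: dict mapping node -> set of neighbours.
    A 2-tuple (u, v) is locally adjacent to (w, v) when w is a neighbour
    of u, and to (u, w) when w is a neighbour of v.
    """
    nodes = list(adj)
    tuples = list(product(nodes, nodes))
    neighbours = {}
    for (u, v) in tuples:
        neighbours[(u, v)] = (
            [(w, v) for w in adj[u]] + [(u, w) for w in adj[v]]
        )
    return neighbours

# Triangle graph: 9 tuples, each with 2 + 2 = 4 local neighbours.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
nbrs = local_two_tuple_graph(adj)
print(len(nbrs), len(nbrs[(0, 1)]))  # 9 4
```

Because each tuple only sees O(degree) neighbours rather than all n tuples sharing a component, this local construction is what makes the higher-order networks scale to sparse graphs.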