
Learnable Spatial-Temporal Positional Encoding for Link Prediction (L-STEP)

This repository is the official implementation of the paper Learnable Spatial-Temporal Positional Encoding for Link Prediction (ICML 2025) by Katherine Tieu*, Dongqi Fu*, Zihao Li, Ross Maciejewski, and Jingrui He.

Data Preprocessing

Datasets

13 datasets: Wikipedia, Reddit, MOOC, LastFM, Enron, Social Evo., UCI, Flights, Can. Parl., US Legis., UN Trade, UN Vote, and Contact.

These datasets are adopted from Towards Better Evaluation for Dynamic Link Prediction, which can be downloaded from here.

After downloading the datasets, please place them in the DG_data folder.
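For example, the preprocessing script expects each raw CSV under its own subfolder of DG_data (an assumed layout, following the DyGLib convention this codebase builds on; adjust the source path to wherever you downloaded the files):

mkdir -p DG_data/enron
mv /path/to/downloads/enron.csv DG_data/enron/enron.csv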

Preprocessing

To preprocess a dataset dataset_name, run:

cd preprocess_data/
python preprocess_data.py --dataset_name [dataset_name]

For example, we preprocess the Enron dataset by running:

cd preprocess_data/
python preprocess_data.py --dataset_name enron
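To preprocess several datasets in one go, a simple shell loop over the same command works (a sketch; apart from enron, the lowercase spellings below are assumptions, so check the names accepted by preprocess_data.py):

cd preprocess_data/
for d in wikipedia reddit mooc lastfm enron; do
    python preprocess_data.py --dataset_name "$d"
done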

Executing Scripts for Temporal Link Prediction

Train L-STEP

To train L-STEP on a dataset dataset_name, run:

python train_STEP_link_prediction.py --dataset_name [dataset_name] --model_name LSTEP --num_runs 5 --gpu [cuda index] --[other configs]

Here is an example of training L-STEP on the Enron dataset:

python train_STEP_link_prediction.py --dataset_name enron --model_name LSTEP --num_runs 5 --gpu 0 --[other configs]

If you want to load the best configurations for Enron, run:

python train_STEP_link_prediction.py --dataset_name enron --model_name LSTEP --num_runs 5 --gpu 0 --load_best_configs
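To train on several datasets back to back with their best configurations, the same command can be wrapped in a shell loop (a sketch; dataset-name spellings other than enron are assumptions):

for d in enron uci mooc; do
    python train_STEP_link_prediction.py --dataset_name "$d" --model_name LSTEP --num_runs 5 --gpu 0 --load_best_configs
done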

Evaluate L-STEP with different negative sampling strategies (NSS)

We evaluate L-STEP with three NSS: random, historical, and inductive.

Here is an example of evaluating L-STEP on Enron with the random NSS:

python evaluate_LSTEP_link_prediction.py --dataset_name enron --model_name LSTEP --num_runs 5 --gpu 0 --negative_sample_strategy random --[other configs]

If you want to load the best configurations during the evaluation, run:

python evaluate_LSTEP_link_prediction.py --dataset_name enron --model_name LSTEP --num_runs 5 --gpu 0 --load_best_configs --negative_sample_strategy random --[other configs]

For historical NSS, set --negative_sample_strategy to historical, and for inductive NSS, set --negative_sample_strategy to inductive.
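For example, the following sketch evaluates Enron under all three strategies in one pass, reusing the command above with --load_best_configs:

for nss in random historical inductive; do
    python evaluate_LSTEP_link_prediction.py --dataset_name enron --model_name LSTEP --num_runs 5 --gpu 0 --load_best_configs --negative_sample_strategy "$nss"
done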

Acknowledgments

We are grateful to the authors of DyGFormer for making their project code publicly available.
