Accepted at the First Learning on Graphs Conference 2022
@InProceedings{arnaiz2022diffwire,
title = {{DiffWire: Inductive Graph Rewiring via the Lov{\'a}sz Bound}},
author = {Arnaiz-Rodr{\'i}guez, Adri{\'a}n and Begga, Ahmed and Escolano, Francisco and Oliver, Nuria M},
booktitle = {Proceedings of the First Learning on Graphs Conference},
pages = {15:1--15:27},
year = {2022},
editor = {Rieck, Bastian and Pascanu, Razvan},
volume = {198},
series = {Proceedings of Machine Learning Research},
month = {09--12 Dec},
publisher = {PMLR},
pdf = {https://proceedings.mlr.press/v198/arnaiz-rodri-guez22a/arnaiz-rodri-guez22a.pdf},
url = {https://proceedings.mlr.press/v198/arnaiz-rodri-guez22a.html}
}
Conda environment
conda create --name <env> --file requirements.txt
or
conda env create -f environment_experiments.yml
conda activate DiffWire
datasets/: Script for creating the synthetic datasets. For the non-synthetic datasets, we use PyG loaders in train.py.
layers/: Implementation of the proposed GAP-Layer, CT-Layer, and the baseline MinCutPool (based on its official repository). A reference computation of the commute times used by CT-Layer is sketched after the example commands below.
transforms/: Implementation of the graph preprocessing baselines DIGL and SDRF, both based on the official repositories of those works.
trained_models/: Files with the weights of some trained models.
nets.py: Implementation of the GNNs used in our experiments.
train.py: Script with inline arguments for running the experiments, for example:
python train.py --dataset REDDIT-BINARY --model CTNet --cuda cuda:0
python train.py --dataset REDDIT-BINARY --model GAPNet --derivative laplacian --cuda cuda:0
python train.py --dataset REDDIT-BINARY --model GAPNet --derivative normalized --cuda cuda:0
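For reference, the commute times that CT-Layer learns to approximate can be computed exactly on a small graph from the pseudoinverse of the combinatorial Laplacian. The following is only an illustrative NumPy sketch; it is not part of the repository's API, and the function, variable names, and toy graph are ours:

import numpy as np

def commute_times(A):
    # CT[u, v] = vol(G) * R_eff(u, v), with R_eff from the Laplacian pseudoinverse
    deg = A.sum(axis=1)
    vol = deg.sum()                       # vol(G): sum of node degrees
    L = np.diag(deg) - A                  # combinatorial Laplacian L = D - A
    L_pinv = np.linalg.pinv(L)            # Moore-Penrose pseudoinverse L^+
    d = np.diag(L_pinv)
    R = d[:, None] + d[None, :] - 2 * L_pinv   # pairwise effective resistances
    return vol * R

# Toy 4-node path graph (illustrative data only)
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
print(commute_times(A))

CT-Layer learns these quantities differentiably inside the network rather than through an explicit pseudoinverse, which would not scale to large graphs.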
experiments_all.sh: Lists all the experiments.
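As a rough orientation, a sweep like the one in experiments_all.sh can be expressed as a Python loop over (dataset, model) pairs that launches train.py. The dataset and model lists below are placeholders and may not match the script exactly:

import itertools
import subprocess

datasets = ["REDDIT-BINARY", "IMDB-BINARY", "COLLAB"]   # placeholder subset
models = ["CTNet", "GAPNet"]                            # placeholder subset

for dataset, model in itertools.product(datasets, models):
    cmd = ["python", "train.py", "--dataset", dataset, "--model", model, "--cuda", "cuda:0"]
    if model == "GAPNet":
        # GAPNet additionally takes the --derivative flag (laplacian or normalized)
        cmd += ["--derivative", "laplacian"]
    subprocess.run(cmd, check=True)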
See the Jupyter notebook examples from the tutorial presented at the First Learning on Graphs Conference: Graph Rewiring: From Theory to Applications in Fairness.