This is the code repository for "Towards Deep Attention in Graph Neural Networks: Problems and Remedies," published at ICML 2023.
Code to reproduce the node classification results in Tables 3 and 8 is provided.
Also, Poster.pdf is the poster presented at ICML 2023; it includes some additional visualizations from the original paper.
The link to my video presentation is here.
The AERO-GNN model code is in ./AERO-GNN/model.py.
Tuned hyperparameters for all models are provided in the shell files in the ./run folder.
Datasets are stored in the ./graph-data folder; they are downloaded automatically when the code is run.
main.py loads datasets, initializes hyperparameters, and runs the entire pipeline.
train_dense.py and train_sparse.py load, train, and evaluate the designated GNNs for node classification.
model.py contains all the models used in the experiments.
layer.py implements the graph convolution layers of some models.
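For illustration, a minimal attention-based graph convolution stack in PyG looks like the sketch below. This is only a generic GAT-style example, not the AERO-GNN layer itself; see layer.py and the paper for the actual implementation.

```python
import torch
from torch_geometric.nn import GATConv

# Generic two-layer graph attention network, for illustration only.
# The AERO-GNN layer in layer.py differs; see the paper for details.
class TwoLayerGAT(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim, heads=8, dropout=0.7):
        super().__init__()
        self.conv1 = GATConv(in_dim, hid_dim, heads=heads, dropout=dropout)
        self.conv2 = GATConv(hid_dim * heads, out_dim, heads=1, dropout=dropout)

    def forward(self, x, edge_index):
        x = torch.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)
```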
The code will run 100 trials of node classification on the designated dataset and print the results after each trial. Example run commands are saved in shell files in the ./run folder; for instance:
python ./AERO-GNN/main.py --model aero --dataset chameleon --iterations 32 --dr 0.0001 --dr-prop 0.0001 --dropout 0.7 --add-dropout 0 --lambd 1.0 --num-layers 2
Running main.py will automatically download the designated datasets from PyG.
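As a sketch, this is roughly how PyG fetches a dataset into a local folder on first use (the exact dataset classes and root path used in main.py are assumptions here):

```python
from torch_geometric.datasets import WikipediaNetwork

# Downloads chameleon into the root folder on first use; later runs load from disk.
# The root path "./graph-data" mirrors this repo's layout but is an assumption.
dataset = WikipediaNetwork(root="./graph-data", name="chameleon")
data = dataset[0]  # a Data object with x (features), edge_index (edges), y (labels)
```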
The code to load the filtered Chameleon and Squirrel datasets, proposed by Platonov et al. (2023, ICLR), is in filtered_dataset.py.
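For reference, Platonov et al. release the filtered datasets as .npz files; a minimal loading sketch, assuming their public file format and key names, looks like this (see filtered_dataset.py for the actual loading code):

```python
import numpy as np
import torch

# Assumed file path and key names, based on the public release of
# Platonov et al. (2023); the real code in filtered_dataset.py may differ.
raw = np.load("./graph-data/chameleon_filtered.npz")
x = torch.tensor(raw["node_features"])
y = torch.tensor(raw["node_labels"])
edge_index = torch.tensor(raw["edges"]).t().contiguous()
```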
The loading and preprocessing code for each dataset is in utils.py.
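As one common example of such preprocessing, many GNNs symmetrically normalize the adjacency matrix. Below is a minimal sketch using PyG utilities; utils.py may implement this differently.

```python
import torch
from torch_geometric.utils import add_self_loops, degree

def sym_normalize(edge_index, num_nodes):
    # Computes the edge weights of D^{-1/2} (A + I) D^{-1/2},
    # the normalization used by many graph convolutions.
    edge_index, _ = add_self_loops(edge_index, num_nodes=num_nodes)
    row, col = edge_index
    deg = degree(row, num_nodes, dtype=torch.float)
    deg_inv_sqrt = deg.pow(-0.5)
    deg_inv_sqrt[deg_inv_sqrt == float("inf")] = 0
    return edge_index, deg_inv_sqrt[row] * deg_inv_sqrt[col]
```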
Requirements:
dgl==1.1.1
dgl_cu113==0.9.1.post1
numpy==1.21.2
torch==1.11.0+cu113
torch_geometric==2.1.0
torch_scatter==2.0.9
torch_sparse==0.6.13
tqdm==4.62.3
If you use this code, please cite our paper:
@inproceedings{lee2023towards,
title={Towards Deep Attention in Graph Neural Networks: Problems and Remedies},
author={Lee, Soo Yong and Bu, Fanchen and Yoo, Jaemin and Shin, Kijung},
booktitle={International Conference on Machine Learning},
year={2023},
organization={PMLR}
}
For any questions, please email us ({syleetolow, boqvezen97, kijungs}@kaist.ac.kr, {jaeminyoo}@cmu.edu)!