AERO-GNN

This is the code repository of "Towards Deep Attention in Graph Neural Networks: Problems and Remedies," published in ICML 2023.
Code to reproduce the node classification results in Tables 3 and 8 is provided.

Poster.pdf is the poster presented at ICML 2023; it includes some additional visualizations beyond the original paper.
The link to my video presentation is here.

Table 3

[Image: node classification results from Table 3 of the paper.]

Basics

The AERO-GNN model code is in ./AERO-GNN/model.py.
Tuned hyperparameters for all models are provided as shell files in the ./run folder.
Datasets are stored in the ./graph-data folder and are downloaded automatically when running the code.

Code in the AERO-GNN Folder

main.py loads datasets, initializes hyperparameters, and runs the entire pipeline.
train_dense.py and train_sparse.py load, train, and evaluate the designated GNNs for node classification.
model.py contains all the models used in the experiments.
layer.py contains implementations of some models' graph convolution layers.

Run Code

The code runs 100 trials of node classification on the designated dataset and prints the result after each trial. The commands are saved as shell files in the ./run folder.
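As a rough illustration of the multi-trial protocol, the loop below runs a fixed number of trials and reports an aggregate score. Note that run_trial and its accuracy values are hypothetical stand-ins, not the repository's actual training code:

```python
import random
import statistics

def run_trial(seed: int) -> float:
    # Hypothetical stand-in for one train/evaluate cycle;
    # the real code trains a GNN and returns test accuracy.
    random.seed(seed)
    return 0.70 + random.random() * 0.05

# Run 100 trials, collecting the accuracy of each.
accuracies = []
for trial in range(100):
    acc = run_trial(trial)
    accuracies.append(acc)

# Report mean and standard deviation over all trials.
print(f"mean acc: {statistics.mean(accuracies):.4f} "
      f"+/- {statistics.stdev(accuracies):.4f}")
```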

Example: Running Chameleon

python ./AERO-GNN/main.py --model aero --dataset chameleon --iterations 32 --dr 0.0001 --dr-prop 0.0001 --dropout 0.7 --add-dropout 0 --lambd 1.0 --num-layers 2
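The flags in the command above suggest an argparse setup along the following lines. This is only a sketch: the actual parser in main.py may differ in argument names, types, and defaults, and the inline interpretations of --dr and --iterations are assumptions:

```python
import argparse

parser = argparse.ArgumentParser(description="AERO-GNN node classification (sketch)")
parser.add_argument("--model", type=str, default="aero")
parser.add_argument("--dataset", type=str, default="chameleon")
parser.add_argument("--iterations", type=int, default=32)    # propagation steps (assumed)
parser.add_argument("--dr", type=float, default=0.0001)      # decay rate (assumed)
parser.add_argument("--dr-prop", type=float, default=0.0001)
parser.add_argument("--dropout", type=float, default=0.7)
parser.add_argument("--add-dropout", type=int, default=0)
parser.add_argument("--lambd", type=float, default=1.0)
parser.add_argument("--num-layers", type=int, default=2)

# Parse the flags from the example command above.
args = parser.parse_args([
    "--model", "aero", "--dataset", "chameleon",
    "--iterations", "32", "--dr", "0.0001", "--dr-prop", "0.0001",
    "--dropout", "0.7", "--add-dropout", "0",
    "--lambd", "1.0", "--num-layers", "2",
])
print(args.model, args.dataset, args.iterations, args.num_layers)
```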

Datasets

Running main.py automatically downloads the designated datasets from PyG.
The code to load the filtered Chameleon and Squirrel datasets, proposed by Platonov et al. (2023, ICLR), is in filtered_dataset.py.
The loading and preprocessing code for each dataset is in utils.py.

Requirements

dgl==1.1.1
dgl_cu113==0.9.1.post1
numpy==1.21.2
torch==1.11.0+cu113
torch_geometric==2.1.0
torch_scatter==2.0.9
torch_sparse==0.6.13
tqdm==4.62.3
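One way to install these pinned versions (a sketch, assuming pip and a CUDA 11.3 environment; adjust the torch and dgl wheels to your hardware):

```shell
# Install the pinned dependencies listed above (CUDA 11.3 builds assumed).
pip install numpy==1.21.2 tqdm==4.62.3
pip install torch==1.11.0+cu113 --extra-index-url https://download.pytorch.org/whl/cu113
pip install torch_scatter==2.0.9 torch_sparse==0.6.13 \
    -f https://data.pyg.org/whl/torch-1.11.0+cu113.html
pip install torch_geometric==2.1.0
pip install dgl_cu113==0.9.1.post1
```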

Bibtex

@inproceedings{lee2023towards,
  title={Towards Deep Attention in Graph Neural Networks: Problems and Remedies},
  author={Lee, Soo Yong and Bu, Fanchen and Yoo, Jaemin and Shin, Kijung},
  booktitle={International Conference on Machine Learning},
  year={2023},
  organization={PMLR}
}

Contacts

For any questions, please email us ({syleetolow, boqvezen97, kijungs}@kaist.ac.kr, {jaeminyoo}@cmu.edu)!
