PyTorch repo for DeepGCNs (ICCV'2019 Oral, TPAMI'2021), DeeperGCN (arXiv'2020) and GNN1000 (ICML'2021): https://www.deepgcns.org


DL-DIY potential project ideas

  • read and understand the ICML paper
  • run the experiments on ogbn-proteins (might be too long) or on another smaller dataset; a minimal data-loading sketch follows below
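
As a starting point, here is a minimal, illustrative sketch of loading the ogbn-proteins dataset with the official `ogb` package. This is not the repo's training script (that lives in `examples/ogb`); it only assumes `ogb` and `torch_geometric` are installed:

```python
# Minimal sketch: load ogbn-proteins with the official OGB loader.
# Assumes `pip install ogb torch_geometric`; not this repo's training code.
from ogb.nodeproppred import PygNodePropPredDataset, Evaluator

dataset = PygNodePropPredDataset(name="ogbn-proteins")
split_idx = dataset.get_idx_split()          # dict with 'train'/'valid'/'test' node indices
data = dataset[0]                            # one large graph (a torch_geometric Data object)
evaluator = Evaluator(name="ogbn-proteins")  # official ROC-AUC evaluator for this dataset

print(data)                                  # node/edge counts and available features
```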

DeepGCNs: Can GCNs Go as Deep as CNNs?

In this work, we present new ways to successfully train very deep GCNs. We borrow concepts from CNNs, mainly residual/dense connections and dilated convolutions, and adapt them to GCN architectures. Through extensive experiments, we show the positive effect of these deep GCN frameworks.
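
To make the residual-connection idea concrete, here is an illustrative sketch of a single residual GCN block in PyTorch Geometric. The repo's actual building blocks live in `gcn_lib`; this toy class is only an example of the concept, not the repo's implementation:

```python
# Illustrative sketch of a residual GCN block (the ResGCN idea).
# Not the repo's exact code; assumes torch and torch_geometric are installed.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class ResGCNBlock(torch.nn.Module):
    """One GCN layer with a residual (skip) connection, as in ResNet."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv = GCNConv(channels, channels)

    def forward(self, x, edge_index):
        # Residual connection: output = input + GCN(input).
        return x + F.relu(self.conv(x, edge_index))
```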

[Project] [Paper] [Slides] [Tensorflow Code] [Pytorch Code]

Overview

We conduct extensive experiments to show how different components (#Layers, #Filters, #Nearest Neighbors, Dilation, etc.) affect DeepGCNs. We also provide ablation studies on different types of deep GCNs (MRGCN, EdgeConv, GraphSAGE and GIN).

How to train, test and evaluate our models

Please refer to the README.md of each task inside the examples folder for details. All information about the code, data, and pretrained models can be found there.

Recommended Requirements

Install the environment by running:

source deepgcn_env_install.sh

Code Architecture

.
├── misc                    # Misc images
├── utils                   # Common useful modules
├── gcn_lib                 # gcn library
│   ├── dense               # gcn library for dense data (B x C x N x 1)
│   └── sparse              # gcn library for sparse data (N x C)
├── eff_gcn_modules         # modules for memory-efficient GNNs
├── examples 
│   ├── modelnet_cls        # code for point clouds classification on ModelNet40
│   ├── sem_seg_dense       # code for point clouds semantic segmentation on S3DIS (data type: dense)
│   ├── sem_seg_sparse      # code for point clouds semantic segmentation on S3DIS (data type: sparse)
│   ├── part_sem_seg        # code for part segmentation on PartNet
│   ├── ppi                 # code for node classification on PPI dataset
│   ├── ogb                 # code for node/graph property prediction on OGB datasets
│   └── ogb_eff             # code for node/graph property prediction on OGB datasets with memory efficient GNNs
└── ...
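
For orientation, the `dense` and `sparse` layouts noted in the tree differ only in tensor shape. A small illustrative conversion using plain `torch` (this is an assumption-labeled example, not a helper from this repo):

```python
# Illustrative conversion between the two data layouts used by gcn_lib
# (assumes each graph in the batch has the same number of points N).
import torch

B, C, N = 2, 64, 1024              # batch size, channels, points per cloud
x_dense = torch.randn(B, C, N, 1)  # dense layout: B x C x N x 1

# Flatten to the sparse layout: (B*N) x C, one row per point.
x_sparse = x_dense.squeeze(-1).permute(0, 2, 1).reshape(B * N, C)
```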

Citation

Please cite our papers if you find anything helpful:

@InProceedings{li2019deepgcns,
    title={DeepGCNs: Can GCNs Go as Deep as CNNs?},
    author={Guohao Li and Matthias Müller and Ali Thabet and Bernard Ghanem},
    booktitle={The IEEE International Conference on Computer Vision (ICCV)},
    year={2019}
}
@article{li2021deepgcns_pami,
  title={DeepGCNs: Making GCNs Go as Deep as CNNs},
  author={Li, Guohao and M{\"u}ller, Matthias and Qian, Guocheng and Perez, Itzel Carolina Delgadillo and Abualshour, Abdulellah and Thabet, Ali Kassem and Ghanem, Bernard},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2021},
  publisher={IEEE}
}
@misc{li2020deepergcn,
    title={DeeperGCN: All You Need to Train Deeper GCNs},
    author={Guohao Li and Chenxin Xiong and Ali Thabet and Bernard Ghanem},
    year={2020},
    eprint={2006.07739},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
@InProceedings{li2021gnn1000,
    title={Training Graph Neural Networks with 1000 layers},
    author={Guohao Li and Matthias Müller and Bernard Ghanem and Vladlen Koltun},
    booktitle={International Conference on Machine Learning (ICML)},
    year={2021}
}

License

MIT License

Contact

For more information, please contact Guohao Li, Matthias Müller, or Guocheng Qian.
