Source: https://github.com/dmlc/dgl/tree/master/examples/pytorch/hgt
I have added some comments on top of the original code. Notably, the implementation appears simpler than, and somewhat different from, the mathematical formulation in the original paper; see the `HGTLayer` class in model.py.
Run it with `python train_acm.py`. On non-Windows systems, modify `data_file_path` in train_acm.py before running.
The original README follows.
Alternative PyTorch-Geometric implementation
“Heterogeneous Graph Transformer” is a graph neural network architecture that can deal with large-scale heterogeneous and dynamic graphs.
This toy experiment is based on DGL's official tutorial. As the ACM dataset doesn't have input features, we simply assign random features to each node. This step can be replaced by any prepared features.
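The random-feature assignment mentioned above can be sketched as follows. This is a minimal, library-free illustration (the function name `random_features`, the toy node counts, and the feature dimension are assumptions for the example, not the repo's actual values; the real code would use PyTorch tensors instead of Python lists):

```python
import random

def random_features(num_nodes, n_inp, seed=0):
    """Assign each node a random Gaussian feature vector, standing in
    for real input features (which the ACM dataset does not provide)."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(n_inp)]
            for _ in range(num_nodes)]

# Toy node counts per type (illustrative only, not the real ACM sizes).
node_counts = {"paper": 5, "author": 4, "subject": 3}
feats = {ntype: random_features(n, n_inp=8)
         for ntype, n in node_counts.items()}
```

Because the vectors are random, any prepared features (e.g. bag-of-words of paper titles) can be dropped in as a replacement without changing the rest of the pipeline.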
The reference performance against R-GCN and an MLP, averaged over 5 runs:
Model | Test Accuracy | # Parameters |
---|---|---|
2-layer HGT | 0.465 ± 0.007 | 2,176,324 |
2-layer RGCN | 0.392 ± 0.013 | 416,340 |
MLP | 0.132 ± 0.003 | 200,974 |