ZhengliangDuanfang/HGT

Chinese comments added on top of DGL's HGT example, to make it easier to understand and adapt.

Source: https://github.com/dmlc/dgl/tree/master/examples/pytorch/hgt

I have added some comments on top of that example. Notably, the implementation appears simpler than, and somewhat different from, the mathematical formulation in the original paper; see the HGTLayer class in model.py.

Run with python train_acm.py. On non-Windows systems, modify data_file_path in train_acm.py before running (see the sketch below).
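
As a hedged illustration of such a change (the filename ACM.mat is taken from the upstream DGL example; how train_acm.py actually uses the variable may differ), one portable way to build the path is:

```python
# A minimal sketch, not the repository's exact code: build data_file_path
# relative to the script so it works on both Windows and Unix-like systems.
import os

data_file_path = os.path.join(
    os.path.dirname(os.path.abspath(__file__)),
    "ACM.mat",  # filename assumed from the upstream DGL example
)
```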

The original README follows.


Heterogeneous Graph Transformer (HGT)

Alternative PyTorch-Geometric implementation

Heterogeneous Graph Transformer is a graph neural network architecture that can deal with large-scale heterogeneous and dynamic graphs.

This toy experiment is based on DGL's official tutorial. As the ACM dataset doesn't come with input features, we simply assign random features to each node; this step can be replaced with any prepared features (a sketch is shown below).
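
As a hedged sketch of this random-feature step (the toy graph, node/edge types, feature name "inp", and dimension below are illustrative, not the repository's exact code):

```python
import dgl
import torch

# Toy heterograph standing in for the ACM graph (node/edge types are illustrative).
G = dgl.heterograph({
    ("paper", "written-by", "author"): ([0, 1, 2], [0, 0, 1]),
    ("author", "writing", "paper"): ([0, 0, 1], [0, 1, 2]),
})

n_inp = 256  # input feature dimension
for ntype in G.ntypes:
    # Random features per node type; swap in prepared features here if available.
    feat = torch.randn(G.number_of_nodes(ntype), n_inp)
    G.nodes[ntype].data["inp"] = feat
```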

The reference performance against R-GCN and MLP, averaged over 5 runs:

Model          Test Accuracy    # Parameters
2-layer HGT    0.465 ± 0.007    2,176,324
2-layer RGCN   0.392 ± 0.013    416,340
MLP            0.132 ± 0.003    200,974
