- Deep Learning on Graphs: A Survey
- Graph Neural Networks: A Review of Methods and Applications
- A Comprehensive Survey on Graph Neural Networks
https://github.com/sungyongs/graph-based-nn
https://github.com/thunlp/GNNPapers
Spatio-temporal modeling paper list (mainly related to graph convolution)
https://mp.weixin.qq.com/s/xgf7A3GFh1cIM2QhaCyyoA
Literature of Deep Learning for Graphs
- Deep Learning on Graphs: A Survey
[新智元 commentary] - Graph Neural Networks: A Review of Methods and Applications
[新智元 commentary] - A Comprehensive Survey on Graph Neural Networks
- Relational inductive biases, deep learning, and graph networks
- Geometric Deep Learning: Going beyond Euclidean data
- Computational Capabilities of Graph Neural Networks
- Neural Message Passing for Quantum Chemistry
- Non-local Neural Networks
- The Graph Neural Network Model
- PyTorch Geometric, a geometric deep learning library in PyTorch [github], which implements several graph neural networks including ChebNet, 1stChebNet, GraphSAGE, MPNNs, GAT, and SplineCNN.
- Deep Graph Library (DGL) [website] [github] provides fast implementations of many graph neural networks, with a set of functions on top of popular deep learning platforms such as PyTorch and MXNet.
- graph_nets [github]
- pytorch_geometric is fast: https://github.com/rusty1s/pytorch_geometric
- The following four papers form a timeline: each one improves on the previous ★
- The Emerging Field of Signal Processing on Graphs
- Spectral Networks and Locally Connected Networks on Graphs
- Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, [PyTorch Code] [TF Code]
- Semi-Supervised Classification with Graph Convolutional Networks, [Code], [Blog]
- The following three papers are further spectral methods mentioned in the survey "A Comprehensive Survey on Graph Neural Networks"
- Deep convolutional networks on graph-structured data
- Adaptive graph convolutional neural networks (AAAI 2018): accepts graphs of arbitrary structure and size as input
- Cayleynets: Graph convolutional neural networks with complex rational spectral filters
- Drawbacks of spectral graph convolutional networks:
- by "A Comprehensive Survey on Graph Neural Networks"
- spectral methods usually handle the whole graph simultaneously and are difficult to parallelize or scale to large graphs (P2)
- more drawbacks: -- (P7, 4.1.3 summary of spectral methods)
- any perturbation of the graph perturbs the eigenbasis U (the eigenvectors)
- the learned filters are domain-dependent and cannot be applied to a different graph structure
- eigendecomposition is expensive in both computation and memory
- Although the filters defined by ChebNet and 1stChebNet are localized in space and shared across all positions (nodes) of the graph, both models must load the entire graph to perform the graph convolution, which is inefficient on big graphs. by yaya: with X' = AXW, computing the update X' requires the whole X as input (see the sketch below)
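A minimal sketch of this point (assuming PyTorch; `GCNLayer` and the pre-normalized adjacency `A_hat` are illustrative names, not from the surveyed papers): a 1stChebNet-style layer computes X' = relu(ÂXW), so a single forward pass must consume the entire graph.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One 1stChebNet-style layer: X' = relu(A_hat @ X @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, A_hat, X):
        # A_hat: [N, N] normalized adjacency; X: [N, in_dim].
        # Every output row mixes rows of the *full* X through A_hat,
        # which is why the whole graph must be loaded for one update.
        return torch.relu(A_hat @ self.W(X))
```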
- by "A Comprehensive Survey on Graph Neural Networks"
- Graph Attention Network (GAT) (ICLR 2018) [tf code]
- Inductive representation learning on large graphs (GraphSAGE) [tf code] ★
Instead of updating states over all nodes, GraphSAGE proposes a batch-training algorithm (sub-graph training) that improves scalability for large graphs. The learning process: P9 in "A Comprehensive Survey on Graph Neural Networks"
- Neural Message Passing for Quantum Chemistry (MPNNs)
- Learning convolutional neural networks for graphs (PATCHY-SAN)
- Geometric deep learning on graphs and manifolds using mixture model cnns
- Learning convolutional neural networks for graphs
- Large-scale learnable graph convolutional networks (LGCN) [tf code]
- Diffusion-convolutional neural networks (NeurIPS 2016) [tf code] ★
- Geometric deep learning on graphs and manifolds using mixture model cnns (CVPR 2017)
- etc.: the tables on P5 and P7 of "A Comprehensive Survey on Graph Neural Networks" respectively list further spatial-based GCNs
- Together with sampling strategies, the computation can be performed on a batch of nodes instead of the whole graph (GraphSAGE and LGCN); see the sketch below
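A minimal sketch of such node-batched computation with neighbor sampling, in the spirit of GraphSAGE's mean aggregator (the fixed fan-out, function names, and adjacency-list format are assumptions for illustration, not the paper's code):

```python
import random
import torch

def sample_neighbors(adj_list, node, fanout):
    """Draw a fixed-size neighborhood, so a batch touches only a small
    sub-graph instead of the whole graph."""
    nbrs = adj_list[node]
    return random.choices(nbrs, k=fanout) if nbrs else [node]

def sage_layer(h, adj_list, batch, W_self, W_nbr, fanout=5):
    """GraphSAGE-style mean-aggregator update for a batch of nodes."""
    out = []
    for v in batch:
        sampled = sample_neighbors(adj_list, v, fanout)
        nbr_mean = torch.stack([h[u] for u in sampled]).mean(dim=0)
        out.append(torch.relu(h[v] @ W_self + nbr_mean @ W_nbr))
    return torch.stack(out)
```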
- by "A Comprehensive Survey on Graph Neural Networks"
- bridges: The graph convolution defined by 1stChebNet (semi-supervised GCN) is localized in space. It bridges the gap between spectral-based methods and spatial-based methods. -- (P2)
- Drawbacks of spectral-based models, illustrated below from three aspects: efficiency, generality, and flexibility. -- (P11)
- Efficiency
Spectral-based models either need to compute eigenvectors or handle the whole graph at once, so their computational cost grows dramatically with graph size.
Spatial-based models perform convolution directly in the graph domain by aggregating the features of neighboring nodes, and thus have the potential to handle large graphs. In addition, computation can be performed on batches of nodes rather than on the whole graph, and as the number of neighbors grows, sampling strategies can improve efficiency -- see "Training Methods" below on improving GCN training.
- Generality
Spectral-based models assume a fixed graph during training and generalize poorly to new or different graphs.
Spatial-based models perform graph convolution per node, so the learned weights can easily be shared across nodes and graphs.
- Flexibility
Spectral-based models are limited to undirected graphs: there is no clear definition of the Laplacian matrix for directed graphs, so applying a spectral method to a directed graph requires first converting it to an undirected one.
Spatial-based models handle multi-source inputs more flexibly, where multi-source inputs include edge features, edge directions, etc.
For edge features, see "GNNs whose input allows edge features" below.
- by "A Comprehensive Survey on Graph Neural Networks"
- Comparison Between Spectral and Spatial Models -- (P11)
- 1stChebNet (semi-supervised GCN): the main drawback of 1stChebNet is that its computation cost grows exponentially with the number of 1stChebNet layers during batch training, since each node in the last layer has to expand its neighborhood recursively across the previous layers.
- Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
assumes the rescaled adjacency matrix A comes from a sampling distribution.
- Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
reduces the receptive field of the graph convolution to an arbitrarily small scale by sampling neighborhoods and using historical hidden representations.
- Adaptive sampling towards fast graph representation learning (NeurIPS 2018)
proposes an adaptive layer-wise sampling approach to accelerate the training of 1stChebNet, where sampling for the lower layer is conditioned on the top one. A rough sketch of the importance-sampling idea follows.
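As a hedged sketch of the layer-wise importance-sampling idea behind FastGCN (the notation and the sampling distribution `q` are assumptions, not the paper's exact construction): rather than computing A_hat @ X over all N nodes, sample m nodes and reweight them so the estimate stays unbiased.

```python
import torch

def sampled_propagation(A_hat, X, q, m):
    """Monte-Carlo estimate of A_hat @ X using m node samples from q.

    Each sampled column j is reweighted by 1 / (m * q[j]), so the
    estimate is unbiased while only m (not N) nodes enter the layer.
    """
    idx = torch.multinomial(q, m, replacement=True)  # sampled node ids
    weights = 1.0 / (m * q[idx])                     # importance weights
    return A_hat[:, idx] @ (X[idx] * weights.unsqueeze(1))
```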
- by "Graph Neural Networks: A Review of Methods and Applications"
- Training Methods -- (P9)
- GCN requires the full graph Laplacian, which is computationally expensive for large graphs. Furthermore, the embedding of a node at layer L is computed recursively from the embeddings of all its neighbors at layer L − 1, so the receptive field of a single node grows exponentially with the number of layers and computing the gradient for a single node is costly. Finally, GCN is trained independently on a fixed graph, so it lacks the ability to do inductive learning.
- Inductive representation learning on large graphs (NeurIPS 2017)
- Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
directly samples the receptive field for each layer.
- Adaptive sampling towards fast graph representation learning (NeurIPS 2018)
introduces a parameterized and trainable sampler to perform layer-wise sampling conditioned on the former layer.
- Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
proposes a control-variate-based stochastic approximation algorithm for GCN, utilizing the historical activations of nodes as a control variate.
- Deeper insights into graph convolutional networks for semi-supervised learning (arXiv:1801.07606, 2018)
- by "Deep Learning on Graphs: A Survey"
- Accelerating by Sampling -- (P8)
- Inductive representation learning on large graphs (NeurIPS 2017)
- Graph convolutional neural networks for web-scale recommender systems
- Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
- Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
- by "A Comprehensive Survey on Graph Neural Networks"
- Graph Attention Network (GAT) (ICLR 2018) [tf code] ★
- Gaan: Gated attention networks for learning on large and spatiotemporal graphs
- Graph classification using structural attention(ACM SIGKDD 2018)
- Watch your step: Learning node embeddings via graph attention(NeurIPS 2018)
- by "Graph Neural Networks: A Review of Methods and Applications"
- Gated graph sequence neural networks (arXiv 2016)
- Improved semantic representations from tree-structured long short-term memory networks (IJCNLP 2015)
- Conversation modeling on reddit using a graph-structured lstm (TACL 2018)
- Sentence-state lstm for text representation (ACL 2018)
- Semantic object parsing with graph lstm (ECCV 2016)
- by yaya: since residual networks let CNNs gain performance from added layers, residual connections are tried here for the same reason, i.e., to improve performance while increasing depth. See "Go deeper?" below.
- by "Deep Learning on Graphs: A Survey" -- (P7 Residual and Jumping Connections)
- Semi-supervised classification with graph convolutional networks (ICLR 2017)
- Column networks for collective classification (AAAI 2017)
- Representation learning on graphs with jumping knowledge networks (ICML 2018)
- by "Graph Neural Networks: A Review of Methods and Applications" -- P9 Skip Connections
- Semi-supervised user geolocation via graph convolutional networks (ACL 2018)
- Representation learning on graphs with jumping knowledge networks (ICML 2018)
- by yaya
- Can GCNs Go as Deep as CNNs? (ICCV 2019) *
- by "A Comprehensive Survey on Graph Neural Networks"
- Network embedding algorithms can be categorized as: 1. matrix factorization, 2. random walks, 3. deep learning; graph auto-encoders are a deep-learning approach. -- (P2)
- Network embedding aims to map nodes into a low-dimensional vector space, preserving both network topology and node content, so that subsequent graph analysis tasks (e.g., classification, clustering, recommendation) can be performed with existing machine-learning methods (e.g., SVM for classification).
- Variational graph auto-encoders (GAE) [tkipf/code] [tf code]
used for link prediction in citation networks;
the encoder updates the node embeddings and the decoder reconstructs the adjacency matrix A (see the sketch after this list)
- Adversarially regularized graph autoencoder for graph embedding (ARGA) [tf code]
- Learning deep network representations with adversarially regularized autoencoders (NetRA)
- Deep neural networks for learning graph representations (DNGR) [matlab code]
- Structural deep network embedding (SDNE) [python code]
- Deep recursive network embedding with regular equivalence (DRNE)(https://github.com/tadpole/DRNE)
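A minimal, non-variational sketch of the GAE encoder/decoder pattern noted above (assuming PyTorch and a pre-normalized adjacency `A_hat`; layer sizes and names are illustrative). The inner-product decoder sigmoid(Z Zᵀ) scores every node pair, which is why link prediction is the natural task:

```python
import torch
import torch.nn as nn

class GraphAutoEncoder(nn.Module):
    """Encoder: two GCN-style propagations produce node embeddings Z.
    Decoder: sigmoid(Z @ Z.T) reconstructs the adjacency matrix."""
    def __init__(self, in_dim, hid_dim, emb_dim):
        super().__init__()
        self.W1 = nn.Linear(in_dim, hid_dim, bias=False)
        self.W2 = nn.Linear(hid_dim, emb_dim, bias=False)

    def encode(self, A_hat, X):
        H = torch.relu(A_hat @ self.W1(X))
        return A_hat @ self.W2(H)          # node embeddings Z

    def decode(self, Z):
        return torch.sigmoid(Z @ Z.t())    # predicted edge probabilities

    def forward(self, A_hat, X):
        return self.decode(self.encode(A_hat, X))
```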
- by "A Comprehensive Survey on Graph Neural Networks"
- factor the generation process into forming nodes and edges alternately
- Graphrnn: A deep generative model for graphs (ICML 2018) [tf code]
- Learning deep generative models of graphs (ICML 2018)
- employ generative adversarial training
- Molgan: An implicit generative model for small molecular graphs (arXiv:1805.11973, 2018)
- Net-gan: Generating graphs via random walks (ICML 2018)
- by "A Comprehensive Survey on Graph Neural Networks"
- Diffusion convolutional recurrent neural network: Data-driven traffic forecasting (DCRNN) (ICLR 2018)
applies the idea of graph convolution in DCRNN for spatio-temporal traffic-flow forecasting, achieving strong results.
- Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting (CNN-GNN) (IJCAI 2018) [tf code]
- Spatial temporal graph convolutional networks for skeleton-based action recognition (ST-GCN) (AAAI 2018) [pytorch code]
- Structural-rnn:Deep learning on spatio-temporal graphs (Structural-RNN) (CVPR 2016) [theano code]
- by yaya
- Both of the following are on skeleton-based action recognition ★
- Skeleton-Based Action Recognition with Spatial Reasoning and Temporal Stack Learning (ECCV 2018)
- Spatial temporal graph convolutional networks for skeleton-based action recognition (ST-GCN) (AAAI 2018) [pytorch code]
- by "Deep Learning on Graphs: A Survey"
- Graphrnn: Generating realistic graphs with deep auto-regressive models (ICML 2018)
- Dynamic graph neural networks (arXiv preprint 2018)
- Geometric matrix completion with recurrent multi-graph neural networks (NeurIPS 2017)
- Dynamic graph convolutional networks (arXiv preprint 2017)
Dynamic GCN applies an LSTM to gather the outputs of GCNs over different time slices of a dynamic network, aiming to capture both spatial and temporal graph information (see the sketch below).
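A minimal sketch of that GCN-then-LSTM pattern (shapes, names, and the shared GCN weight are assumptions; this illustrates the idea, not the paper's implementation):

```python
import torch
import torch.nn as nn

class DynamicGCNSketch(nn.Module):
    """Run a shared GCN layer on each time slice of a dynamic graph,
    then aggregate the per-slice node representations with an LSTM."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, hid_dim, bias=False)
        self.lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)

    def forward(self, A_slices, X_slices):
        # A_slices: [T, N, N] adjacency per slice; X_slices: [T, N, d]
        H = torch.stack([torch.relu(A @ self.W(X))
                         for A, X in zip(A_slices, X_slices)])  # [T, N, h]
        out, _ = self.lstm(H.transpose(0, 1))  # LSTM over time, per node
        return out[:, -1]                       # final temporal state [N, h]
```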
- by yaya
- Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks
- Structured Sequence Modeling with Graph Convolutional Recurrent Networks
- by "Deep Learning on Graphs: A Survey"
- Graph convolutional policy network for goal-directed molecular graph generation (NeurIPS 2018)
- Molgan: An implicit generative model for small molecular graphs (arXiv preprint 2018)
- by "A Comprehensive Survey on Graph Neural Networks"
- The graph neural network model(GNN) (2009)
- Neural message passing for quantum chemistry(MPNN) (2017)
- Diffusion-convolutional neural networks(DCNN) (2016)
- Learning convolutional neural networks for graphs(PATCHY-SAN) (2016)
- by "Deep Learning on Graphs: A Survey"
- Geniepath: Graph neural networks with adaptive receptive paths
- Dual graph convolutional networks for graph-based semi-supervised classification
- Signed graph convolutional network
- by yaya ★
- Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
- Exploring Visual Relationship for Image Captioning
- Edge-labeling graph neural network for few-shot learning (CVPR 2019)
- GCN-LASE: Towards Adequately Incorporating Link Attributes in Graph Convolutional Networks (IJCAI 2019)
- Representation Learning for Attributed Multiplex Heterogeneous Network (KDD 2019)
- Dynamic edge-conditioned filters in convolutional neural networks on graphs (CVPR 2017)
- Dynamic graph cnn for learning on point clouds
Order invariance: a critical requirement for the graph readout operation is that it be invariant to node ordering, i.e., if we relabel the nodes and edges by a bijection between two vertex sets, the representation of the whole graph should not change.
1. Statistics
- by "Deep Learning on Graphs: A Survey"
- The most basic order-invariant operations are simple statistics such as sum, average, or max-pooling (a minimal sketch follows at the end of this subsection)
- Convolutional networks on graphs for learning molecular fingerprints
- Diffusion-convolutional neural networks
- other
- Molecular graph convolutions: moving beyond fingerprints
- Spectral networks and locally connected networks on graphs
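A minimal sketch of these statistics-based readouts (PyTorch assumed): permuting the rows of H, i.e. relabeling the nodes, leaves each result unchanged, which is exactly the invariance requirement above.

```python
import torch

def readout(H, mode="mean"):
    """Order-invariant readout over node features H of shape [N, d]."""
    if mode == "sum":
        return H.sum(dim=0)
    if mode == "mean":
        return H.mean(dim=0)
    if mode == "max":
        return H.max(dim=0).values
    raise ValueError(f"unknown mode: {mode}")

# sanity check: a node permutation does not change the readout
H = torch.randn(5, 8)
perm = torch.randperm(5)
assert torch.allclose(readout(H), readout(H[perm]))
```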
2. Hierarchical Clustering
- by "Deep Learning on Graphs: A Survey"
- Spectral networks and locally connected networks on graphs
- Deep convolutional networks on graph-structured data
- Hierarchical Graph Representation Learning with Differentiable Pooling [code] ★
3. Graph Pooling Modules
- by "A Comprehensive Survey on Graph Neural Networks"
- Convolutional neural networks on graphs with fast localized spectral filtering (NeurIPS 2016)
- Deep convolutional networks on graph-structured data
- An end-to-end deep learning architecture for graph classification (AAAI 2018) [code] [pytorch code]
- Hierarchical graph representation learning with differentiable pooling (NeurIPS 2018) [code]
- SFFAI talk | 呼奋宇: deep hierarchical graph convolutional neural networks
- Hierarchical graph representation learning with differentiable pooling (NeurIPS 2018) [code]
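A sketch of one DiffPool coarsening step from the paper above (in the full model the assignment logits come from a GNN; taking `S_logits` as given is an assumption for brevity):

```python
import torch

def diffpool_step(A, X, S_logits):
    """Softly assign N nodes to K clusters and coarsen the graph:
    S = softmax(S_logits); X' = S^T X; A' = S^T A S."""
    S = torch.softmax(S_logits, dim=-1)  # [N, K] soft assignment
    X_pool = S.t() @ X                   # [K, d] pooled node features
    A_pool = S.t() @ A @ S               # [K, K] coarsened adjacency
    return A_pool, X_pool
```

Stacking such steps yields the hierarchical, end-to-end differentiable pooling that distinguishes DiffPool from flat statistics-based readouts.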
- Overview
- by "Graph Neural Networks: A Review of Methods and Applications"
- by "A Comprehensive Survey on Graph Neural Networks"
- detect and recognize objects and predict semantic relationships between pairs of objects
- Scene graph generation by iterative message passing (CVPR 2017)
- Graph r-cnn for scene graph generation (ECCV 2018)
- Factorizable net: an efficient subgraph-based framework for scene graph generation (ECCV 2018)
- generating realistic images given scene graphs
- by "A Comprehensive Survey on Graph Neural Networks"
- Image generation from scene graphs (arXiv preprint, 2018)
- by "A Comprehensive Survey on Graph Neural Networks"
- Dynamic graph cnn for learning on point clouds(arXiv preprint 2018)
- Large-scale point cloud semantic segmentation with superpoint graphs (CVPR 2018)
- Rgcnn: Regularized graph cnn for point cloud segmentation (arXiv preprint 2018)
- by "A Comprehensive Survey on Graph Neural Networks"
- detects the locations of human joints in video clips
- Spatial temporal graph convolutional networks for skeleton-based action recognition (ST-GCN) (AAAI 2018) [pytorch code]
- Structural-rnn:Deep learning on spatio-temporal graphs (Structural-RNN) (CVPR 2016) [theano code]
- by yaya
- not use skeleton
- Deep learning on spatio-temporal graphs (CVPR 2016)
- Videos as Space-Time Region Graph (ECCV 2018)
- VideoGraph: Recognizing Minutes-Long Human Activities in Videos (arXiv 2019)
- by "Graph Neural Networks: A Review of Methods and Applications"
- Few-shot learning with graph neural networks (ICLR 2018) [code]
- Zero-shot recognition via semantic embeddings and knowledge graphs (CVPR 2018)
- Multi-label zero-shot learning with structured knowledge graphs (arXiv preprint 2017)
- Rethinking knowledge graph propagation for zero-shot learning(arXiv preprint 2018)
- The more you know: Using knowledge graphs for image classification (arXiv preprint 2016)
- by yaya
- by "A Comprehensive Survey on Graph Neural Networks"
- image classification
- Few-shot learning with graph neural networks (ICLR 2018) [code]
- 3d action recognition
- by yaya
- image classification
- Zero-shot recognition via semantic embeddings and knowledge graphs (CVPR 2018)
- Multi-label zero-shot learning with structured knowledge graphs (arXiv preprint 2017)
- Rethinking knowledge graph propagation for zero-shot learning(arXiv preprint 2018)
- by "A Comprehensive Survey on Graph Neural Networks"
- 3d graph neural networks for rgbd semantic segmentation (CVPR 2017)
- Syncspeccnn: Synchronized spectral cnn for 3d shape segmentation (CVPR 2017)
- by "Graph Neural Networks: A Review of Methods and Applications"
- Semantic object parsing with graph lstm (ECCV 2016)
- Interpretable structure-evolving lstm (CVPR 2017)
- Large-scale point cloud semantic segmentation with superpoint graphs(arXiv preprint 2017)
- Dynamic graph cnn for learning on point clouds(arXiv preprint 2018)
- 3d graph neural networks for rgbd semantic segmentation (CVPR 2017)
- by "Graph Neural Networks: A Review of Methods and Applications"
- A simple neural network module for relational reasoning. Adam Santoro, David Raposo, David G.T. Barrett, Mateusz Malinowski, Razvan Pascanu, Peter Battaglia, Timothy Lillicrap. NeurIPS 2017. paper
- Graph-Structured Representations for Visual Question Answering. Damien Teney, Lingqiao Liu, Anton van den Hengel. CVPR 2017. paper
- Out of the Box: Reasoning with Graph Convolution Nets for Factual Visual Question Answering. Medhini Narasimhan, Svetlana Lazebnik, Alexander Schwing. NeurIPS 2018. paper
- Learning Conditioned Graph Structures for Interpretable Visual Question Answering. Will Norcliffe-Brown, Efstathios Vafeias, Sarah Parisot. NeurIPS 2018. paper [code]
- Deep reasoning with knowledge graph for social relationship understanding.
- by "Graph Neural Networks: A Review of Methods and Applications"
- Relation networks for object detection (CVPR 2018)
- Learning region features for object detection (arXiv preprint 2018)
- by "Graph Neural Networks: A Review of Methods and Applications"
- Learning human-object interactions by graph parsing neural networks (arXiv preprint 2018)
- Structural-rnn:Deep learning on spatio-temporal graphs (CVPR 2016)
- by "Graph Neural Networks: A Review of Methods and Applications"
- Iterative visual reasoning beyond convolutions (arXiv preprint 2018)
- Deep reasoning with knowledge graph for social relationship understanding
- Overview
- by "Graph Neural Networks: A Review of Methods and Applications"
- Graph Convolutional Encoders for Syntax-aware Neural Machine Translation
- Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
- [official code (theano 0.8.2, lasagne 0.1)] [PyTorch reimplementation] ★
- 专知 commentary
- by yaya: I read this paper mainly because of the following paper applying graph convolution to image captioning: Exploring Visual Relationship for Image Captioning
Both papers use the same graph-convolution formula.
- The GCN formula on the semantic graph differs in form from X' = AXW; it can only be read as the general GNN form in which a node's features are updated by aggregating the features of its neighbors, not, as the paper claims, something "more formal" than the semi-supervised GCN (or perhaps I have misunderstood what "formally" means there).
- Regarding the formula: the semantic graph is built by first running StanfordCoreNLP on the given sentence to produce a syntactic dependency tree, and then constructing the graph from that tree.
- Parsing the formula: each node is a word of the sentence, and the edges come from the syntactic dependency tree, connecting pairs of words that have a syntactic dependency; each edge also carries a label (dependency label / syntactic function) such as 'nsubj' or 'advmod'. In the figure's example, the W in the formula depends on the neighboring node and A depends on the edge label (see the sketch below).
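A hedged sketch of such an edge-labeled update, in the spirit of the SRL paper's syntactic GCN (direction-specific weights plus a label-dependent bias; the edge-list format and all names here are illustrative assumptions, not the paper's code):

```python
import torch
import torch.nn as nn

class EdgeLabeledGCNLayer(nn.Module):
    """h'[v] = relu(sum over edges (u -> v) of W_dir(h[u]) + b_label)."""
    def __init__(self, dim, num_labels):
        super().__init__()
        # separate transforms for incoming, outgoing, and self edges
        self.W = nn.ModuleDict({d: nn.Linear(dim, dim, bias=False)
                                for d in ("in", "out", "self")})
        self.b = nn.Embedding(num_labels, dim)  # one bias per dependency label

    def forward(self, h, edges):
        # edges: list of (src, dst, direction, label_id) tuples
        out = torch.zeros_like(h)
        for src, dst, direction, label_id in edges:
            out[dst] = out[dst] + self.W[direction](h[src]) + self.b.weight[label_id]
        return torch.relu(out)
```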
- by "Graph Neural Networks: A Review of Methods and Applications"
- Action recognition
- Non-local Neural Networks
- Nonlocal Neural Networks, Nonlocal Diffusion and Nonlocal Modeling (an extended version of non-local; not yet read; it may have no concrete application)
- Videos as Space-Time Region Graphs
- Few-shot image classification
- Few-Shot Learning with Graph Neural Networks
- Image captioning
- Exploring Visual Relationship for Image Captioning
- Semantic role labeling
- Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
1. Go deeper?
- by "Graph Neural Networks: A Review of Methods and Applications" & "A Comprehensive Survey on Graph Neural Networks"
(1) Current GNNs are mostly shallow, because node representations become smoother as the number of layers grows. In other words, graph convolution essentially pulls the representations of adjacent nodes closer together, so in theory, after infinitely many convolutions, the representations of all nodes converge to a single fixed point and the distinguishability and richness of node features are lost. Whether adding depth is a good strategy on graph-structured data remains an open question. [Deeper insights into graph convolutional networks for semi-supervised learning]
(2) When stacking k layers, each node aggregates information from neighbors up to k hops away; if the neighborhood contains noise, the noise also grows exponentially with the number of layers. P9 of "Graph Neural Networks: A Review of Methods and Applications" -- skip connection. A toy demonstration of (1) follows.
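A toy demonstration of the over-smoothing in point (1) (row-normalized propagation on a 3-node path graph with self-loops; my own construction, not from either survey): repeatedly applying the propagation matrix drives all node features toward the same vector.

```python
import torch

def propagate(A_hat, X, k):
    """Apply the (parameter-free) propagation A_hat to X, k times."""
    for _ in range(k):
        X = A_hat @ X
    return X

# row-normalized adjacency (with self-loops) of the path graph 1-2-3
A_hat = torch.tensor([[1/2, 1/2, 0.0],
                      [1/3, 1/3, 1/3],
                      [0.0, 1/2, 1/2]])
X = torch.eye(3)                   # maximally distinguishable features
print(propagate(A_hat, X, 50))     # all rows converge to the same vector
```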
Motivated by the clear gains that traditional deep neural networks obtain from added depth, some researchers have also tried to make GNNs deeper:
- solution:
- by "A Comprehensive Survey on Graph Neural Networks"
- Gated graph sequence neural networks (arXiv 2016)
- Deeper insights into graph convolutional networks for semi-supervised learning (arXiv preprint 2018)
- by "Graph Neural Networks: A Review of Methods and Applications"
- Semi-supervised user geolocation via graph convolutional networks (ACL 2018)
- Representation learning on graphs with jumping knowledge networks (ICML 2018)
2. Non-structural Scenarios -> generating graphs from raw data
- by "Graph Neural Networks: A Review of Methods and Applications"
- Although applications of graphs in non-structural scenarios (image, text) were discussed above, there is still no optimal method for generating graphs from raw data. In the image domain, some works use a CNN to obtain feature maps and take sampled superpixels as nodes, while others extract detected objects as nodes. In the text domain, some works use syntactic dependency trees as syntactic graphs, while others simply adopt fully connected graphs.
Finding the best graph generation approach would therefore widen the range of fields in which GNNs can contribute.
3. Dynamic graphs
- by "Deep Learning on Graphs: A Survey"
In social networks, new members join and existing members leave, so the graph is dynamic, whereas current methods are built on static graphs.
How to model the evolving characteristics of dynamic graphs and support incrementally updating model parameters largely remains open in the literature. Some preliminary works try to tackle this problem using Graph RNN architectures, with encouraging results.
- solution:
- Dynamic graph neural networks (arXiv preprint 2018)
- Dynamic graph convolutional networks (arXiv preprint 2017)
4. Different types of graphs
- by "Deep Learning on Graphs: A Survey"
- solution:
- Heterogeneous graphs: Heterogeneous network embedding via deep architectures
- Signed networks: Signed graph convolutional network
- Hypergraphs: Structural deep embedding for hyper-networks (AAAI 2018)
5. Interpretability
- by "Deep Learning on Graphs: A Survey"
- Since graphs are often tied to other disciplines, interpreting deep learning models on graphs is critical for decision-making problems; for example, in drug- or disease-related problems, interpretability is essential for translating computational experiments into clinical use. However, because the nodes and edges of a graph are tightly interrelated, interpretability is even more challenging for graph-based deep learning than for other black-box models.
6. Compositionality
- by "Deep Learning on Graphs: A Survey"
- Many existing methods can be composed with one another, e.g., using a GCN as a layer inside GAEs or Graph RNNs. Besides designing new building blocks, how to compose existing structures in a principled way is also an interesting direction; a recent attempt, Graph Networks, introduces a general framework for GNNs and GCNs and highlights its use in relational reasoning problems.
7. Scalability -> can GNNs handle large graphs?
- by "Graph Neural Networks: A Review of Methods and Applications"
Scaling up GNNs is difficult because many of the core steps are computationally expensive in big-data environments: 1. graphs are not regular Euclidean spaces, and each node's receptive field (neighborhood structure) is different, so it is hard to train over nodes in batches; 2. computing the graph Laplacian is also hard when the graph is large.
- by yaya: I think this claim is imprecise; as the analysis above shows, only spectral methods need to compute the graph Laplacian.
- by "A Comprehensive Survey on Graph Neural Networks"
When a GCN stacks multiple layers, the final state of a node is determined by the states of a large number of neighbors ((1~k)-hop neighbors), which makes backpropagation expensive. Two kinds of methods, fast sampling and sub-graph training, have been proposed to improve efficiency, but they are still not scalable enough to handle deep architectures with large graphs.
- solution:
fast sampling
- Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
- Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
sub-graph training
- Inductive representation learning on large graphs (NeurIPS 2017)
- Large-scale learnable graph convolutional networks (ACM 2018)
- by yaya: I think this explanation takes the perspective of deep GNNs and does not make clear whether shallow GNNs can be applied to large graphs.
- yaya's conclusion: a GCN based on X' = AXW must feed the entire graph into the network and cannot compute in batches of nodes, so the computation is heavy; for networks designed with sub-graph training, the limitation may be that when nodes have many neighbors and the network is also deep, the computation again becomes very large.
8. Receptive Field
- by "A Comprehensive Survey on Graph Neural Networks"
- The "Receptive Field" here follows the section "Accelerating by Sampling" in "Deep Learning on Graphs: A Survey"; the goal is likewise to speed up training.
- A node's receptive field is the node itself plus its neighbors, but the number of neighbors varies enormously, from one to thousands, following a power-law distribution. Sampling strategies have therefore been proposed, and how to select a representative receptive field for a node remains to be explored.
- solution:
- Inductive representation learning on large graphs (NeurIPS 2017)
- Learning convolutional neural networks for graphs (ICML 2016)
- Large-scale learnable graph convolutional networks (ACM SIGKDD 2018)
- Common pitfalls in evaluating graph neural networks -- Pitfalls of Graph Neural Network Evaluation
- Proposes a theoretical framework for analyzing the representational power of GNNs -- How Powerful are Graph Neural Networks? (ICLR 2019)
专知 commentary