Simplified multilayer graph convolutional networks with dropout

Cited by: 0
Authors
Fei Yang
Huyin Zhang
Shiming Tao
Affiliations
[1] Wuhan University, School of Computer Science
[2] Key Laboratory of Urban Land Resources Monitoring and Simulation
[3] Ministry of Natural Resources
Source
Applied Intelligence | 2022 / Volume 52
Keywords
Graph convolutional networks; Multilayer; Dropout; Feature augmentation
DOI
Not available
Abstract
Graph convolutional networks (GCNs) and their variants are powerful deep learning methods for graph-structured data. Multilayer GCNs can repeatedly smooth node features, which yields considerable performance improvements. However, they inherit unnecessary complexity and redundant computation and, worse, become increasingly prone to overfitting as the number of layers grows. In this paper, we present simplified multilayer graph convolutional networks with dropout (DGCs), novel neural network architectures that successively remove nonlinearities and merge weight matrices between graph convolutional layers, while leveraging a dropout layer to achieve feature augmentation and effectively reduce overfitting. Concretely, we first extend a shallow GCN to a multilayer GCN. We then reduce the complexity and redundant computation of the multilayer GCN while improving its classification performance. Finally, we make DGCs readily applicable to both inductive and transductive tasks. Extensive experiments on citation networks and social networks show that the proposed model matches or outperforms state-of-the-art methods.
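For readers who want a concrete picture of the simplification the abstract describes, below is a minimal sketch, assuming an SGC-style collapse of the stacked propagation layers into repeated multiplication by a normalized adjacency matrix followed by a single merged linear layer, with dropout applied to the smoothed features. The class name, hyperparameters, and exact dropout placement are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class SimplifiedGCNWithDropout(nn.Module):
    """Illustrative sketch (not the paper's code): K-step feature smoothing
    with the per-layer weight matrices merged into one linear classifier,
    plus dropout on the propagated features as regularization."""

    def __init__(self, in_dim: int, num_classes: int, k: int = 2, p_drop: float = 0.5):
        super().__init__()
        self.k = k                                   # number of propagation steps (assumed)
        self.dropout = nn.Dropout(p_drop)            # feature augmentation / overfitting control
        self.linear = nn.Linear(in_dim, num_classes) # merged weight matrices, no nonlinearities

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # adj_norm: symmetrically normalized adjacency with self-loops,
        # S = D^{-1/2} (A + I) D^{-1/2}, dense or sparse.
        for _ in range(self.k):
            # nonlinearity-free propagation, i.e. computing S^K X
            x = torch.sparse.mm(adj_norm, x) if adj_norm.is_sparse else adj_norm @ x
        x = self.dropout(x)          # dropout after smoothing (placement is an assumption)
        return self.linear(x)        # single merged weight matrix produces class logits


# Example usage with random data, purely illustrative:
# x = torch.randn(num_nodes, in_dim)
# logits = SimplifiedGCNWithDropout(in_dim, num_classes, k=2)(x, adj_norm)
```

Because the propagation contains no trainable parameters, the smoothed features can in principle be precomputed once, which is the source of the complexity reduction the abstract refers to.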
Pages: 4776-4791
Page count: 15