Are Graph Convolutional Networks With Random Weights Feasible?

Cited by: 62
Authors
Huang, Changqin [1 ]
Li, Ming [1 ]
Cao, Feilong [2 ]
Fujita, Hamido [3 ,4 ,5 ]
Li, Zhao [6 ]
Wu, Xindong [7 ]
Affiliations
[1] Zhejiang Normal Univ, Key Lab Intelligent Educ Technol & Applicat Zhejia, Jinhua 321017, Peoples R China
[2] China Jiliang Univ, Coll Sci, Hangzhou 314423, Peoples R China
[3] HUTECH Univ, Ho Chi Minh City 70000, Vietnam
[4] Univ Granada, Andalusian Res Inst Data Sci & Computat Intelligen, Granada 18011, Spain
[5] Iwate Prefectural Univ, Reg Res Ctr, Iwate 0200693, Japan
[6] Link2Do Technol Ltd, Hangzhou 310027, Peoples R China
[7] Hefei Univ Technol, Key Lab Knowledge Engn Big Data, Minist Educ China, Hefei 230002, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Training; Analytical models; Upper bound; Stability analysis; Neural networks; Convolution; Convolutional neural networks; Graph convolutional networks; random weights; stability and generalization; approximation upper bound; NEURAL-NETWORKS; APPROXIMATION; STABILITY; ALGORITHM; INSIGHTS;
DOI
10.1109/TPAMI.2022.3183143
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Convolutional Networks (GCNs), as a prominent example of graph neural networks, are receiving extensive attention for their powerful capability in learning node representations on graphs. There are various extensions, in sampling and/or node feature aggregation, that further improve GCNs' performance, scalability, and applicability across domains. Still, there is room to improve learning efficiency, because performing batch gradient descent over the full dataset at every training iteration, as is unavoidable when training (vanilla) GCNs, is not a viable option for large graphs. The potential of random features to speed up the training phase in large-scale problems motivates us to consider carefully whether GCNs with random weights are feasible. To investigate this issue theoretically and empirically, we propose a novel model termed Graph Convolutional Networks with Random Weights (GCN-RW), which revises the convolutional layer with random filters and simultaneously adjusts the learning objective with a regularized least squares loss. Theoretical analyses of the model's approximation upper bound, structure complexity, stability, and generalization are provided with rigorous mathematical proofs. The effectiveness and efficiency of GCN-RW are verified on semi-supervised node classification tasks with several benchmark datasets. Experimental results demonstrate that, in comparison with some state-of-the-art approaches, GCN-RW can achieve better or comparable accuracy with less training time.
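The construction described in the abstract, a graph convolutional layer with fixed random filters followed by output weights solved in closed form via regularized least squares, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the single-hidden-layer setup, the ReLU nonlinearity, and the hyperparameters (`hidden_dim`, `lam`) are assumptions for demonstration.

```python
import numpy as np

def gcn_rw_fit(adj, feats, labels, hidden_dim=16, lam=1e-2, seed=0):
    """Fit a one-layer GCN with random weights (illustrative sketch).

    The graph convolution uses a fixed random filter; only the output
    weights are obtained, in closed form, from a ridge-regression
    objective. No gradient descent is involved.
    """
    n = adj.shape[0]
    # Symmetrically normalized adjacency with self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    a = adj + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    a_hat = a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Random (untrained) convolutional filter, followed by ReLU.
    rng = np.random.default_rng(seed)
    w_rand = rng.standard_normal((feats.shape[1], hidden_dim)) / np.sqrt(feats.shape[1])
    h = np.maximum(a_hat @ feats @ w_rand, 0.0)

    # Output weights via regularized least squares:
    # W_out = (H^T H + lam * I)^{-1} H^T Y
    w_out = np.linalg.solve(h.T @ h + lam * np.eye(hidden_dim), h.T @ labels)
    return a_hat, w_rand, w_out

def gcn_rw_predict(a_hat, feats, w_rand, w_out):
    """Predict class indices by reusing the fixed random filter."""
    h = np.maximum(a_hat @ feats @ w_rand, 0.0)
    return (h @ w_out).argmax(axis=1)
```

Because the only trained parameters have a closed-form solution, the fit cost is one linear solve in the hidden dimension, which is the source of the training-time savings the abstract reports.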
Pages: 2751 - 2768
Page count: 18
Related Papers
50 records in total
  • [21] Disentangled Graph Convolutional Networks
    Ma, Jianxin
    Cui, Peng
    Kuang, Kun
    Wang, Xin
    Zhu, Wenwu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [22] Signed Graph Convolutional Networks
    Derr, Tyler
    Ma, Yao
    Tang, Jiliang
    2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018, : 929 - 934
  • [23] Universal Graph Convolutional Networks
    Jin, Di
    Yu, Zhizhi
    Huo, Cuiying
    Wang, Rui
    Wang, Xiao
    He, Dongxiao
    Han, Jiawei
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021,
  • [24] Graph Anomaly Detection with Graph Convolutional Networks
    Mir, Aabid A.
    Zuhairi, Megat F.
    Musa, Shahrulniza
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (11) : 601 - 613
  • [25] Simplifying Graph Convolutional Networks
    Wu, Felix
    Zhang, Tianyi
    de Souza, Amauri Holanda, Jr.
    Fifty, Christopher
    Yu, Tao
    Weinberger, Kilian Q.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [26] Convolutional Graph Neural Networks
    Gama, Fernando
    Marques, Antonio G.
    Leus, Geert
    Ribeiro, Alejandro
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 452 - 456
  • [27] Contrastive Graph Learning with Graph Convolutional Networks
    Nagendar, G.
    Sitaram, Ramachandrula
    DOCUMENT ANALYSIS SYSTEMS, DAS 2022, 2022, 13237 : 96 - 110
  • [28] Personalized Driver Gene Prediction Using Graph Convolutional Networks with Conditional Random Fields
    Wei, Pi-Jing
    Zhu, An-Dong
    Cao, Ruifen
    Zheng, Chunhou
    BIOLOGY-BASEL, 2024, 13 (03):
  • [29] A Convolutional Accelerator for Neural Networks With Binary Weights
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    2018 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2018,
  • [30] A review on neural networks with random weights
    Cao, Weipeng
    Wang, Xizhao
    Ming, Zhong
    Gao, Jinzhu
    NEUROCOMPUTING, 2018, 275 : 278 - 287