Graph Information Vanishing Phenomenon in Implicit Graph Neural Networks

Times Cited: 0
Authors
He, Silu [1 ]
Cao, Jun [1 ]
Yuan, Hongyuan [1 ]
Chen, Zhe [1 ]
Gao, Shijuan [1 ,2 ]
Li, Haifeng [1 ]
Affiliations
[1] Cent South Univ, Sch Geosci & Info Phys, Changsha 410083, Peoples R China
[2] Cent South Univ, Informat & Network Ctr, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
graph neural network; graph information; joint training; graph curvature; 68-XX; CONVOLUTIONAL NETWORKS; RICCI CURVATURE;
DOI
10.3390/math12172659
CLC Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
Graph neural networks (GNNs) have been highly successful in graph representation learning. The goal of GNNs is to enrich node representations by aggregating information from neighboring nodes, and much work has attempted to improve the quality of this aggregation by introducing various kinds of graph information with representational capabilities. The class of GNNs that improve aggregation by encoding such graph information into the weights of neighboring nodes through learnable transformation structures (LTSs) is referred to as implicit GNNs. However, we argue that LTSs merely transform graph information into neighbor weights in whatever direction minimizes the loss function during training and do not actually exploit the effective properties of the graph information, a phenomenon we refer to as graph information vanishing (GIV). To validate this point, we perform thousands of experiments on seven node classification benchmark datasets. We first replace the graph information used by five implicit GNNs with random values and, surprisingly, observe that the resulting accuracies vary by less than +/- 0.3%. We then quantify the similarity between the weights generated from graph information and those generated from random values using cosine similarity, and the cosine similarities exceed 0.99. These empirical results show that the graph information acts only as an initialization of the LTS input. We argue that using graph information as an additional supervised signal to constrain the training of GNNs can effectively alleviate GIV. Here, we propose GinfoNN, which uses both labels and discrete graph curvature as supervised signals to jointly constrain the training of the model. The experimental results show that the classification accuracy of GinfoNN improves by two percentage points over baselines on large and dense datasets.
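The abstract describes two technical ingredients that a short sketch can make concrete: the cosine-similarity check that compares LTS weights learned from real graph information against weights learned from random inputs, and GinfoNN's joint training with node labels and discrete graph curvature as supervised signals. The PyTorch sketch below is illustrative only and assumes details the abstract does not give: the auxiliary curvature head, the MSE regression loss on edge curvature, and the trade-off weight lambda_curv are hypothetical choices, not the authors' implementation.

    # Hedged sketch (not the authors' code): (1) the weight-similarity diagnostic
    # implied by the GIV experiments, and (2) a joint objective that adds discrete
    # edge curvature as an auxiliary supervised signal alongside the label loss.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def weight_similarity(w_graph_info: torch.Tensor, w_random: torch.Tensor) -> float:
        """Cosine similarity between neighbor weights produced by an LTS fed graph
        information and one fed random values (the abstract reports values > 0.99)."""
        return F.cosine_similarity(w_graph_info.flatten(), w_random.flatten(), dim=0).item()

    class AuxCurvatureHead(nn.Module):
        """Hypothetical auxiliary head: regress discrete edge curvature from the
        concatenated embeddings of an edge's two endpoint nodes."""
        def __init__(self, hidden_dim: int):
            super().__init__()
            self.mlp = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim),
                                     nn.ReLU(),
                                     nn.Linear(hidden_dim, 1))

        def forward(self, h: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
            src, dst = edge_index  # edge_index: LongTensor of shape [2, num_edges]
            return self.mlp(torch.cat([h[src], h[dst]], dim=-1)).squeeze(-1)

    def joint_loss(logits, labels, train_mask, h, edge_index, edge_curvature,
                   aux_head: AuxCurvatureHead, lambda_curv: float = 0.5):
        """Joint objective: node-classification loss plus curvature-regression loss.
        lambda_curv is an assumed trade-off weight, not a value from the paper."""
        loss_cls = F.cross_entropy(logits[train_mask], labels[train_mask])
        loss_curv = F.mse_loss(aux_head(h, edge_index), edge_curvature)
        return loss_cls + lambda_curv * loss_curv

In this reading, the curvature loss acts as the additional supervised signal the abstract argues for: the model is constrained to keep curvature information recoverable from its node embeddings rather than letting the LTS discard it during optimization.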
Pages: 19
Related Papers
50 in total
  • [31] Towards Bayesian Learning of the Architecture, Graph and Parameters for Graph Neural Networks
    Valkanas, Antonios
    Panzini, Andre-Walter
    Coates, Mark
    2022 56TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2022, : 852 - 856
  • [32] Visualizing Graph Neural Networks With CorGIE: Corresponding a Graph to Its Embedding
    Liu, Zipeng
    Wang, Yang
    Bernard, Juergen
    Munzner, Tamara
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2022, 28 (06) : 2500 - 2516
  • [33] Graph Neural Networks With Adaptive Structures
    Zhang, Zepeng
    Lu, Songtao
    Huang, Zengfeng
    Zhao, Ziping
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2025, 19 (01) : 181 - 194
  • [34] Revisiting Attention-Based Graph Neural Networks for Graph Classification
    Tao, Ye
    Li, Ying
    Wu, Zhonghai
    PARALLEL PROBLEM SOLVING FROM NATURE - PPSN XVII, PPSN 2022, PT I, 2022, 13398 : 442 - 458
  • [35] Graph Neural Networks for Recommender System
    Gao, Chen
    Wang, Xiang
    He, Xiangnan
    Li, Yong
    WSDM'22: PROCEEDINGS OF THE FIFTEENTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2022, : 1623 - 1625
  • [36] Implementation aspects of Graph Neural Networks
    Barcz, A.
    Szymanski, Z.
    Jankowski, S.
    PHOTONICS APPLICATIONS IN ASTRONOMY, COMMUNICATIONS, INDUSTRY, AND HIGH-ENERGY PHYSICS EXPERIMENTS 2013, 2013, 8903
  • [37] Bipartite Graph Coarsening for Text Classification Using Graph Neural Networks
    dos Santos, Nicolas Roque
    Minatel, Diego
    Baria Valejo, Alan Demetrius
    Lopes, Alneu de A.
    PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS, COMPUTER VISION, AND APPLICATIONS, CIARP 2023, PT I, 2024, 14469 : 589 - 604
  • [38] Polynomial-based graph convolutional neural networks for graph classification
    Pasa, Luca
    Navarin, Nicolo
    Sperduti, Alessandro
    MACHINE LEARNING, 2022, 111 (04) : 1205 - 1237
  • [39] Combine temporal information in session-based recommendation with graph neural networks
    Chen, Quanzhen
    Jiang, Feng
    Guo, Xuyao
    Chen, Jin
    Sha, Kaiyue
    Wang, Yuxuan
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [40] Pre-Train and Learn: Preserving Global Information for Graph Neural Networks
    Zhu, Dan-Hao
    Dai, Xin-Yu
    Chen, Jia-Jun
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2021, 36 (06) : 1420 - 1430