CenGCN: Centralized Convolutional Networks with Vertex Imbalance for Scale-Free Graphs

Cited by: 25
Authors
Xia, Feng [1 ]
Wang, Lei [2 ]
Tang, Tao [1 ]
Chen, Xin [2 ]
Kong, Xiangjie [3 ]
Oatley, Giles [1 ]
King, Irwin [4 ]
Affiliations
[1] Federat Univ Australia, Sch Engn IT & Phys Sci, Ballarat, Vic 3353, Australia
[2] Dalian Univ Technol, Sch Software, Dalian 116620, Peoples R China
[3] Zhejiang Univ Technol, Coll Comp Sci & Technol, Hangzhou 310023, Peoples R China
[4] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convolution; weight measurement; social networking (online); Laplace equations; indexes; data mining; computer science; graph convolutional networks; vertex centrality; network analysis; graph learning; representation learning; distributions
DOI
10.1109/TKDE.2022.3149888
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph Convolutional Networks (GCNs) have achieved impressive performance in a wide variety of areas, attracting considerable attention. The core step of GCNs is the information-passing framework, which treats all information passed from neighbors to the central vertex as equally important. Such equal importance, however, is inadequate for scale-free networks, where hub vertices propagate more dominant information due to vertex imbalance. In this paper, we propose a novel centrality-based framework named CenGCN to address this inequality of information. The framework first quantifies the similarity between hub vertices and their neighbors by label propagation with hub vertices. Based on this similarity and on centrality indices, it transforms the graph by increasing or decreasing the weights of edges connecting hub vertices and by adding self-connections to vertices. In each non-output layer of the GCN, the framework uses a hub attention mechanism to assign new weights to connected non-hub vertices based on their common information with hub vertices. We present two variants, CenGCN_D and CenGCN_E, based on degree centrality and eigenvector centrality, respectively. We also conduct comprehensive experiments, including vertex classification, link prediction, vertex clustering, and network visualization. The results demonstrate that the two variants significantly outperform state-of-the-art baselines.
Pages: 4555-4569
Number of pages: 15
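
To make the graph-transformation idea in the abstract concrete, the following is a minimal NumPy sketch of one ingredient: down-weighting edges incident to hub vertices according to degree centrality and adding self-connections before standard symmetric GCN normalization. The function name centrality_reweight, the quantile-based hub threshold, and the scaling rule are hypothetical placeholders chosen for illustration; the paper's full CenGCN framework additionally uses label propagation with hub vertices and a hub attention mechanism, which are not reproduced here.

import numpy as np

def centrality_reweight(adj, hub_quantile=0.9):
    """Illustrative sketch: scale hub-incident edges by relative degree,
    add self-loops, and apply symmetric GCN normalization.
    (Assumed formulation, not the authors' exact method.)"""
    n = adj.shape[0]
    degree = adj.sum(axis=1)                     # degree centrality (unnormalized)
    is_hub = degree >= np.quantile(degree, hub_quantile)

    weighted = adj.astype(float).copy()
    max_deg = degree.max()
    for u in range(n):
        for v in range(n):
            if weighted[u, v] and (is_hub[u] or is_hub[v]):
                # Down-weight edges touching hubs so hub information
                # does not dominate message passing.
                scale = max(degree[u], degree[v]) / max_deg
                weighted[u, v] /= (1.0 + scale)

    weighted += np.eye(n)                        # add self-connections
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in a standard GCN layer.
    d_inv_sqrt = 1.0 / np.sqrt(weighted.sum(axis=1))
    return weighted * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

if __name__ == "__main__":
    # Tiny star graph: vertex 0 is the hub, connected to four leaves.
    A = np.zeros((5, 5))
    A[0, 1:] = A[1:, 0] = 1.0
    print(centrality_reweight(A).round(3))

In this toy star graph the hub's edges receive smaller weights than they would under plain symmetric normalization, which is the qualitative effect the abstract describes for scale-free graphs.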