Hierarchy-Aware Adaptive Graph Neural Network

Cited by: 1
Authors
Wu, Dengsheng [1 ]
Wu, Huidong [2 ,3 ]
Li, Jianping [4 ]
Affiliations
[1] Shenzhen Univ, Coll Management, Shenzhen 518060, Peoples R China
[2] Chinese Acad Sci, Inst Sci & Dev, Beijing 100190, Peoples R China
[3] Univ Chinese Acad Sci, Sch Publ Policy & Management, Beijing 100049, Peoples R China
[4] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Message passing; Graph neural networks; Adaptive systems; Vectors; Roads; Convolution; Topology; Network topology; Electronic mail; Aggregates; Adaptive context; directed network; graph neural networks (GNNs); node hierarchy; representation learning;
DOI
10.1109/TKDE.2024.3485736
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) have gained attention for their ability to capture node interactions when generating node representations. However, their performance is often limited on real-world directed networks with natural hierarchical structures. Most current GNNs aggregate information from immediate neighbors or within predefined receptive fields, potentially overlooking the long-range dependencies inherent in hierarchical structures. They also tend to neglect node adaptability, which varies with a node's position in the hierarchy. To address these limitations, we propose a new model, the Hierarchy-Aware Adaptive Graph Neural Network (HAGNN), which adaptively captures hierarchical long-range dependencies. Technically, HAGNN infers a hierarchical structure from directional pair-wise node interactions, revealing underlying hierarchical relationships among nodes. The inferred hierarchy helps identify certain key nodes, called Source Hubs in our work, which serve as hierarchical contexts for individual nodes. Shortcuts adaptively connect these Source Hubs with distant nodes, enabling efficient message passing for informative long-range interactions. In comprehensive experiments across multiple datasets, the proposed model outperforms several baseline methods, establishing a new state of the art. Further analysis demonstrates the effectiveness of our approach in capturing relevant adaptive hierarchical contexts, leading to improved and explainable node representations.
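To make the abstract's pipeline concrete, the sketch below is a deliberately simplified toy, not the paper's actual HAGNN formulation: the hierarchy score (out-degree minus in-degree), the top-k "Source Hub" selection rule, the hub-to-all shortcut policy, and mean aggregation are all illustrative assumptions standing in for the learned components the paper describes.

```python
# Toy sketch of the abstract's three steps (all rules here are simplified
# stand-ins, NOT the paper's learned HAGNN components):
#   1) infer a node hierarchy from directed pairwise interactions,
#   2) pick top-scoring nodes as "Source Hubs",
#   3) message passing with shortcuts from hubs to distant nodes.
from collections import defaultdict

def hierarchy_scores(edges, n):
    """Proxy hierarchy score: out-degree minus in-degree (assumption)."""
    score = [0] * n
    for u, v in edges:            # directed edge u -> v
        score[u] += 1
        score[v] -= 1
    return score

def select_source_hubs(scores, k):
    """Pick the k highest-scoring nodes as Source Hubs."""
    return sorted(range(len(scores)), key=lambda i: -scores[i])[:k]

def propagate(edges, feats, hubs):
    """One mean-aggregation message-passing step; every node also
    receives shortcut messages from all Source Hubs (simplified)."""
    n = len(feats)
    nbrs = defaultdict(list)
    for u, v in edges:
        nbrs[v].append(u)         # messages flow along edge direction
    out = []
    for v in range(n):
        srcs = set(nbrs[v]) | {h for h in hubs if h != v}
        msgs = [feats[s] for s in srcs] or [feats[v]]
        out.append(sum(msgs) / len(msgs))
    return out

edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
feats = [1.0, 2.0, 3.0, 4.0]
scores = hierarchy_scores(edges, 4)
hubs = select_source_hubs(scores, k=1)   # node 0 sits atop this toy hierarchy
new_feats = propagate(edges, feats, hubs)
```

In this toy graph the hub shortcut is what lets node 3 receive a message from node 0, two hops away, in a single propagation step; this mirrors the long-range dependency argument in the abstract, although HAGNN itself connects hubs adaptively rather than to every node.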
Pages: 365-378
Page count: 14