Semi-Supervised Mixture Learning for Graph Neural Networks With Neighbor Dependence

Cited by: 7
Authors
Liu, Kai [1 ]
Liu, Hongbo [1 ]
Wang, Tao [2 ]
Hu, Guoqiang [1 ]
Ward, Tomas E. E. [3 ]
Chen, C. L. Philip [4 ]
Affiliations
[1] Dalian Maritime Univ, Coll Artificial Intelligence, Dalian 116026, Peoples R China
[2] Horizon Robot Inc, Beijing 100190, Peoples R China
[3] Dublin City Univ, Insight SFI Res Ctr Data Analyt, Dublin D09WK2D, Ireland
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China; Science Foundation Ireland;
Keywords
Generalized expectation-maximization (GEM); graph neural networks (GNNs); neighbor dependence; semi-supervised learning (SSL); CONVOLUTIONAL NETWORKS;
DOI
10.1109/TNNLS.2023.3263463
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A graph neural network (GNN) is a powerful architecture for semi-supervised learning (SSL). However, the data-driven mode of GNNs raises some challenging problems. In particular, these models suffer from incomplete attribute learning, insufficient structure capture, and an inability to distinguish between node attributes and graph structure, especially on label-scarce or attribute-missing data. In this article, we propose a novel framework, called graph coneighbor neural network (GCoNN), for node classification. It is composed of two modules: GCoNN_Θ and GCoNN_Θ̂. GCoNN_Θ is trained on labeled data to establish the fundamental prototype for attribute learning, while GCoNN_Θ̂ learns neighbor dependence on transductive data through pseudolabels generated by GCoNN_Θ. Next, GCoNN_Θ is retrained to improve the integration of node attributes and neighbor structure through feedback from GCoNN_Θ̂. GCoNN converges iteratively under this scheme. From a theoretical perspective, we analyze this iterative process within a generalized expectation-maximization (GEM) framework, which optimizes an evidence lower bound (ELBO) by amortized variational inference. Empirical evidence demonstrates that the proposed approach achieves state-of-the-art performance, outperforming other methods. We also apply GCoNN to brain functional networks; the results reveal response features across the brain which are physiologically plausible with respect to known language and visual functions.
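The alternating, EM-style training the abstract describes (an attribute module fitted on labeled data generates pseudolabels, a neighbor module refines them via graph structure, and the attribute module is retrained on the result until the labels stabilize) can be illustrated with a deliberately minimal sketch. Everything here is an illustrative assumption, not the authors' implementation: the toy six-node graph, the nearest-centroid classifier standing in for GCoNN_Θ, and the neighbor majority vote standing in for GCoNN_Θ̂.

```python
from statistics import mean

# Toy graph: 6 nodes with 1-D features, two communities; only two labeled
# nodes, mimicking the label-scarce setting the abstract targets.
X = [0.1, 0.2, 0.15, 0.9, 0.8, 0.85]
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
labeled = {0: 0, 3: 1}

def fit_attribute_model(X, labels):
    """Nearest-centroid stand-in for the attribute module (GCoNN_Theta)."""
    centroids = {c: mean(X[i] for i, l in labels.items() if l == c)
                 for c in set(labels.values())}
    return lambda x: min(centroids, key=lambda c: abs(x - centroids[c]))

def propagate_neighbors(pseudo, adj):
    """Neighbor majority vote: stand-in for the neighbor module (GCoNN_Theta_hat)."""
    return {i: max(set(ls), key=ls.count)
            for i in adj
            if (ls := [pseudo[j] for j in adj[i] if j in pseudo])}

# Alternate the two modules until the pseudolabels stop changing.
pseudo = dict(labeled)
for _ in range(10):
    clf = fit_attribute_model(X, pseudo)          # (re)train attribute module
    new = {i: clf(X[i]) for i in range(len(X))}   # pseudolabels for all nodes
    new.update(propagate_neighbors(new, adj))     # refine via neighbor structure
    new.update(labeled)                           # ground-truth labels stay fixed
    if new == pseudo:
        break
    pseudo = new

print(pseudo)
```

On this toy graph the loop converges in two rounds, with each community inheriting the label of its single labeled node; the actual method replaces both stand-ins with trained GNN modules and justifies the alternation as GEM steps on an ELBO.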
Pages: 12528-12539
Page count: 12
References
56 records in total
[1]  
Aggarwal C. C, 2020, Linear Algebra and Optimization for Machine Learning
[2]  
Balcilar M, 2021, PR MACH LEARN RES, V139
[3]  
Bevilacqua B., 2022, PROC 10 INT C LEARN
[4]   A Comprehensive Survey of Graph Embedding: Problems, Techniques, and Applications [J].
Cai, HongYun ;
Zheng, Vincent W. ;
Chang, Kevin Chen-Chuan .
IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2018, 30 (09) :1616-1637
[5]   A Semisupervised Recurrent Convolutional Attention Model for Human Activity Recognition [J].
Chen, Kaixuan ;
Yao, Lina ;
Zhang, Dalin ;
Wang, Xianzhi ;
Chang, Xiaojun ;
Nie, Feiping .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (05) :1747-1756
[6]   A Space Affine Matching Approach to fMRI Time Series Analysis [J].
Chen, Liang ;
Zhang, Weishi ;
Liu, Hongbo ;
Feng, Shigang ;
Chen, C. L. Philip ;
Wang, Huili .
IEEE TRANSACTIONS ON NANOBIOSCIENCE, 2016, 15 (05) :468-480
[7]  
Chen Ming, 2020, P MACHINE LEARNING R, V119
[8]   On Inductive-Transductive Learning With Graph Neural Networks [J].
Ciano, Giorgio ;
Rossi, Alberto ;
Bianchini, Monica ;
Scarselli, Franco .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (02) :758-769
[9]   Differential abundance testing on single-cell data using k-nearest neighbor graphs [J].
Dann, Emma ;
Henderson, Neil C. ;
Teichmann, Sarah A. ;
Morgan, Michael D. ;
Marioni, John C. .
NATURE BIOTECHNOLOGY, 2022, 40 (02) :245-+
[10]  
DOUC R., 2018, Markov Chains