FGTL: Federated Graph Transfer Learning for Node Classification

Cited by: 0
Authors
Mai, Chengyuan [1 ]
Liao, Tianchi [1 ]
Chen, Chuan [1 ]
Zheng, Zibin [2 ]
Affiliations
[1] Sun Yat Sen Univ, Guangzhou, Peoples R China
[2] Sun Yat Sen Univ, Zhuhai, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Federated learning; Graph neural network; Transfer learning; Contrastive learning; Consensus knowledge;
DOI
10.1145/3699962
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812 ;
Abstract
Unsupervised multi-source domain transfer in federated scenarios has become an emerging research direction: it helps an unlabeled target domain obtain an adapted model from source domains while preserving privacy. However, when the local data are graphs, domain differences (i.e., data heterogeneity) mainly originate from differences in node attributes and sub-graph structures, leading to serious model drift, which existing related algorithms do not consider. This scenario poses two challenges: (1) node representations extracted directly by conventional GNNs lack inter-domain generalized and consistent information, making it difficult to apply existing federated learning algorithms; (2) the knowledge of the source domains varies in quality, which may lead to negative transfer. To address these issues, we propose a novel two-phase Federated Graph Transfer Learning (FGTL) framework. In the generalization phase, FGTL uses local contrastive learning and global context embedding to force node representations to capture inter-domain generalized and consistent information, alleviating model drift. In the transfer phase, FGTL uses consensus knowledge to force the classifier's decision boundary to adapt to the target client. In addition, FGTL+ exploits model grouping to make consensus-knowledge generation more efficient, further enhancing the scalability of FGTL. Extensive experiments show that FGTL significantly outperforms state-of-the-art related methods, while FGTL+ further strengthens privacy protection and reduces both communication and computation overhead.
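The abstract's two-phase pipeline pairs a local contrastive objective on node embeddings with server-side aggregation of client models. The sketch below is illustrative only and not the paper's implementation: it shows a generic InfoNCE-style contrastive loss (one positive view, several negatives, temperature `tau`) and plain size-weighted federated averaging over flat parameter vectors; all function names and the flat-vector layout are assumptions.

```python
import math

def fedavg(client_weights, client_sizes):
    """Weighted average of clients' flat parameter vectors by local data size.
    A generic aggregation step, assumed for illustration (not FGTL's exact rule)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            agg[i] += (n / total) * w[i]
    return agg

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, tau=0.5):
    """InfoNCE contrastive loss for one anchor node embedding:
    pull the positive view close, push negatives away."""
    logits = [cosine(anchor, positive) / tau]
    logits += [cosine(anchor, neg) / tau for neg in negatives]
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    return -math.log(exps[0] / sum(exps))

# Toy round: two clients holding 1 and 3 nodes respectively.
print(fedavg([[1.0, 2.0], [3.0, 4.0]], [1, 3]))   # [2.5, 3.5]
# Anchor aligned with its positive and orthogonal to the negative: small loss.
print(info_nce([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]]))
```

The averaging weight `n / total` mirrors the standard FedAvg convention of weighting each client by its local sample count, which the abstract's "source domains" setting implies but does not spell out.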
Pages: 20