Robust Semi-supervised Representation Learning for Graph-Structured Data

Cited by: 3
Authors
Guo, Lan-Zhe [1 ]
Han, Tao [1 ]
Li, Yu-Feng [1 ]
Affiliations
[1] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing 210023, Peoples R China
Source
ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2019, PT III | 2019, Vol. 11441
Funding
National Natural Science Foundation of China; National Key R&D Program of China
Keywords
Robust; Representation learning; Semi-supervised learning; Graph convolutional network;
DOI
10.1007/978-3-030-16142-2_11
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Code
081104; 0812; 0835; 1405
Abstract
The success of machine learning algorithms generally depends on data representation, and many representation learning methods have recently been proposed. However, learning a good representation does not always benefit the classification task; it can even hurt performance when the learned representation is unrelated to the ultimate task, especially when the labeled examples are too few to afford reliable model selection. In this paper, we propose a novel robust semi-supervised graph representation learning method based on the graph convolutional network. To make the learned representation more relevant to the ultimate classification task, we extend the label information under the smoothness assumption and obtain pseudo-labels for unlabeled nodes. Moreover, to make the model robust to noise in the pseudo-labels, we apply a large-margin classifier to the learned representation. Guided by the pseudo-labels and the large-margin principle, the learned representation not only fully exploits the label information encoded in the graph structure but also produces a more rigorous decision boundary. Experiments demonstrate the superior performance of the proposal over many related methods.
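The abstract names two ingredients: pseudo-labels obtained by spreading the known labels over the graph under the smoothness assumption, and a large-margin (hinge-loss) classifier applied to the learned representation. The Python sketch below is an illustrative reconstruction of those two ingredients only, not the authors' implementation; the function names, the label-propagation variant (normalized propagation with clamping of labeled nodes), and the multiclass hinge loss are assumptions. In the paper the representation itself is produced by a graph convolutional network; here it is treated as given.

import numpy as np

def propagate_labels(adj, labels, labeled_mask, alpha=0.9, iters=50):
    """Spread known labels over the graph (smoothness assumption) to obtain
    pseudo-labels for unlabeled nodes.

    adj          : (n, n) symmetric adjacency matrix
    labels       : (n, c) one-hot label matrix; rows of unlabeled nodes are zero
    labeled_mask : (n,) boolean, True where the true label is known
    """
    deg = adj.sum(axis=1).astype(float)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    s = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]   # D^-1/2 A D^-1/2
    f = labels.astype(float).copy()
    for _ in range(iters):
        f = alpha * (s @ f) + (1 - alpha) * labels        # smooth, then anchor
        f[labeled_mask] = labels[labeled_mask]            # clamp known labels
    return f.argmax(axis=1)                               # hard pseudo-labels

def multiclass_hinge_loss(scores, y, margin=1.0):
    """Large-margin (multiclass hinge) loss on classifier scores.

    scores : (n, c) classifier outputs on the learned representation
    y      : (n,) integer labels (true labels or pseudo-labels)
    """
    n = scores.shape[0]
    correct = scores[np.arange(n), y][:, None]
    margins = np.maximum(0.0, scores - correct + margin)
    margins[np.arange(n), y] = 0.0                        # ignore the true class
    return margins.sum(axis=1).mean()

# Tiny usage example on a 4-node path graph with two labeled endpoints:
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
labels = np.zeros((4, 2))
labels[0, 0] = 1          # node 0 labeled class 0
labels[3, 1] = 1          # node 3 labeled class 1
mask = np.array([True, False, False, True])
pseudo = propagate_labels(adj, labels, mask)   # -> array([0, 0, 1, 1])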
Pages: 131-143
Number of pages: 13