Multiple Graph Label Propagation by Sparse Integration

Cited by: 122
Authors
Karasuyama, Masayuki [1]
Mamitsuka, Hiroshi [1]
Affiliations
[1] Kyoto Univ, Bioinformat Ctr, Inst Chem Res, Uji, Kyoto 6110011, Japan
Keywords
Graph-based semisupervised learning; label propagation; multiple graph integration; sparsity; NETWORK INTEGRATION; CLASSIFICATION; REGULARIZATION; FRAMEWORK; GENEMANIA
DOI
10.1109/TNNLS.2013.2271327
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph-based approaches have been among the most successful in semisupervised learning. In this paper, we focus on label propagation in graph-based semisupervised learning. One essential point of label propagation is that performance depends heavily on how well the input graph captures the underlying manifold of the given data. Another, more important point is that in many recent real-world applications, the same instances are represented by multiple heterogeneous data sources. A key challenge in this setting is to integrate the different data representations automatically to achieve better predictive performance. In this paper, we address the problem of obtaining the optimal linear combination of multiple different graphs under the label propagation setting. For this problem, we propose a new formulation with a sparsity property (in the coefficients of the graph combination) that cannot be readily achieved by existing methods. This unique feature provides two important advantages: 1) improved prediction performance by eliminating irrelevant or noisy graphs and 2) interpretability of results, i.e., easy identification of the graphs that are informative for classification. We propose efficient optimization algorithms for this approach, which provide a clear interpretation of the mechanism behind the sparsity. Through various synthetic data sets and two real-world data sets, we empirically demonstrate the advantages of the proposed approach in both prediction performance and graph selection ability.
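To make the setting concrete, the following is a minimal sketch of standard label propagation (in the iterative style of Zhou et al.) applied to a linear combination of graphs. It is an illustration of the general idea only, not the paper's method: the combination weights `mu` are fixed by hand here, whereas the paper learns a sparse `mu` by optimization. All function and variable names are hypothetical.

```python
import numpy as np

def label_propagation(W, y, alpha=0.99, iters=100):
    """Iterative label propagation on a single graph.
    W: (n, n) symmetric nonnegative adjacency matrix.
    y: (n, c) one-hot label matrix with all-zero rows for unlabeled points."""
    d = W.sum(axis=1)
    d[d == 0] = 1.0                       # guard against isolated nodes
    S = W / np.sqrt(np.outer(d, d))       # symmetric normalization D^{-1/2} W D^{-1/2}
    F = y.astype(float).copy()
    for _ in range(iters):
        F = alpha * (S @ F) + (1 - alpha) * y   # spread labels, anchor known ones
    return F

def combine_graphs(graphs, mu):
    """Linear combination sum_k mu_k * W_k; a sparse mu eliminates noisy graphs."""
    return sum(m * W for m, W in zip(mu, graphs))

# Toy example: one informative graph and one noise graph over 6 instances.
rng = np.random.default_rng(0)
W_good = np.zeros((6, 6))
W_good[:3, :3] = 1.0                      # block structure matching the two classes
W_good[3:, 3:] = 1.0
np.fill_diagonal(W_good, 0)
W_noise = rng.random((6, 6))
W_noise = (W_noise + W_noise.T) / 2       # symmetrize
np.fill_diagonal(W_noise, 0)

y = np.zeros((6, 2))
y[0, 0] = 1.0                             # one labeled instance per class
y[3, 1] = 1.0

mu = np.array([1.0, 0.0])                 # sparse weights: the noise graph is dropped
F = label_propagation(combine_graphs([W_good, W_noise], mu), y)
pred = F.argmax(axis=1)                   # -> [0, 0, 0, 1, 1, 1]
```

With the sparse weights above, the noisy graph contributes nothing and the labels propagate cleanly within each block; with `mu = [0.5, 0.5]` the noise edges leak label mass across the classes, which is exactly the effect a learned sparse combination is meant to suppress.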
Pages: 1999-2012
Page count: 14