Leveraging Implicit Relative Labeling-Importance Information for Effective Multi-Label Learning

Cited by: 86
Authors
Li, Yu-Kun [1 ,2 ]
Zhang, Min-Ling [1 ,2 ]
Geng, Xin [1 ,2 ]
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing 210096, Jiangsu, Peoples R China
[2] Southeast Univ, Minist Educ, Key Lab Comp Network & Informat Integrat, Nanjing, Jiangsu, Peoples R China
Source
2015 IEEE International Conference on Data Mining (ICDM) | 2015
Keywords
multi-label learning; relative labeling-importance; label distribution;
DOI
10.1109/ICDM.2015.41
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In multi-label learning, each training example is represented by a single instance while associated with multiple labels, and the task is to predict a set of relevant labels for the unseen instance. Existing approaches learn from multi-label data by assuming equal labeling-importance, i.e. all the associated labels are regarded as relevant while their relative importance to the training example is not differentiated. Nonetheless, this assumption fails to reflect the fact that the importance degree of each associated label is generally different, even though the importance information is not explicitly accessible from the training examples. In this paper, we show that effective multi-label learning can be achieved by leveraging the implicit relative labeling-importance (RLI) information. Specifically, RLI degrees are formalized as a multinomial distribution over the label space and estimated by adapting an iterative label propagation procedure. After that, the multi-label prediction model is learned by fitting the estimated multinomial distribution, regularized with a popular multi-label empirical loss. Comprehensive experiments clearly validate the usefulness of leveraging implicit RLI information to learn from multi-label data.
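To make the estimation step in the abstract concrete, the following is a minimal sketch, assuming a standard graph-based label propagation scheme (kNN similarity graph, propagation anchored to the observed binary labels, then row-normalization into a multinomial RLI distribution). The function name estimate_rli and the parameters alpha, n_neighbors, n_iters, and tol are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def estimate_rli(X, Y, alpha=0.5, n_neighbors=10, n_iters=100, tol=1e-6):
    """Sketch: estimate relative labeling-importance (RLI) degrees by
    iterative label propagation over a kNN similarity graph.

    X: (n, d) feature matrix; Y: (n, q) binary relevance matrix.
    Returns U: (n, q), each row a multinomial distribution over labels."""
    n = X.shape[0]
    # Pairwise Gaussian similarities, restricted to the k nearest neighbors.
    S = rbf_kernel(X)
    np.fill_diagonal(S, 0.0)
    for i in range(n):
        # Zero out all but the n_neighbors largest similarities in row i.
        drop = np.argsort(S[i])[:-n_neighbors]
        S[i, drop] = 0.0
    S = np.maximum(S, S.T)  # symmetrize the graph
    P = S / np.maximum(S.sum(axis=1, keepdims=True), 1e-12)  # row-stochastic

    F = Y.astype(float).copy()
    for _ in range(n_iters):
        # Propagate labeling information while staying anchored to Y.
        F_new = alpha * (P @ F) + (1.0 - alpha) * Y
        if np.abs(F_new - F).max() < tol:
            F = F_new
            break
        F = F_new
    # Normalize each row into a multinomial distribution over the label space.
    U = F / np.maximum(F.sum(axis=1, keepdims=True), 1e-12)
    return U
```

Under this reading, the resulting distribution U would then serve as the fitting target for the prediction model, with the usual multi-label empirical loss acting as the regularizer described in the abstract.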
Pages: 251-260
Page count: 10