Multi-label classification by exploiting local positive and negative pairwise label correlation

Cited by: 86
Authors
Huang, Jun [1 ,2 ]
Li, Guorong [1 ]
Wang, Shuhui [3 ]
Xue, Zhe [1 ]
Huang, Qingming [1 ,3 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Comp & Control Engn, Beijing 101408, Peoples R China
[2] Anhui Univ Technol, Sch Comp Sci & Technol, Maanshan 243032, Peoples R China
[3] Chinese Acad Sci, Key Lab Intelligent Informat Proc, Inst Comp Technol, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-label classification; k nearest neighbors; Local label correlation; Positive and negative label correlation;
DOI
10.1016/j.neucom.2016.12.073
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In multi-label learning, each example is represented by a single instance and associated with multiple class labels. Existing multi-label learning algorithms mainly exploit label correlations globally, by assuming that the label correlations are shared by all the examples, and they typically exploit only the positive correlations among different class labels. In practical applications, however, different examples may share different label correlations, and labels are not only positively correlated but may also be mutually exclusive. In this paper, we propose a simple and effective Bayesian model for multi-label classification by exploiting Local positive and negative Pairwise Label Correlations, named LPLC. In the training stage, the positive and negative label correlations of each ground-truth label are discovered for all the training examples. In the test stage, the k nearest neighbors of each test example and their corresponding positive and negative pairwise label correlations are first identified; we then make predictions by maximizing the posterior probability, which is estimated from the label distribution and the local positive and negative pairwise label correlations embodied in the k nearest neighbors. A comparative study with state-of-the-art approaches demonstrates the competitive performance of the proposed method. (C) 2017 Elsevier B.V. All rights reserved.
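To make the two-stage procedure described in the abstract concrete, the following is a minimal, hypothetical Python sketch of a k-nearest-neighbor multi-label predictor that combines the neighborhood label distribution with locally estimated positive (co-occurrence) and negative (exclusion) pairwise statistics. It illustrates the general idea only and is not the authors' LPLC model or its exact posterior estimation; the function names, the additive scoring rule, and the 0.5 threshold are assumptions introduced here for illustration.

# Illustrative sketch only: a simplified k-NN multi-label predictor that scores
# each label from the local label distribution plus pairwise co-occurrence
# (positive) and exclusion (negative) statistics in the neighborhood.
# NOT the authors' LPLC algorithm; names and scoring rule are hypothetical.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def fit_local_pairwise_knn(X_train, Y_train, k=5):
    """Fit a k-NN index and keep the binary label matrix Y_train (n_samples, n_labels)."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    return {"nn": nn, "Y": np.asarray(Y_train)}

def predict_local_pairwise_knn(model, X_test, threshold=0.5):
    """Predict a binary label vector for each test example from its k nearest neighbors."""
    Y = model["Y"]
    n_labels = Y.shape[1]
    _, idx = model["nn"].kneighbors(X_test)
    preds = np.zeros((len(X_test), n_labels), dtype=int)
    for i, neigh in enumerate(idx):
        Yn = Y[neigh]                         # labels of the k neighbors
        prior = Yn.mean(axis=0)               # local label distribution
        co = (Yn.T @ Yn) / len(neigh)         # local P(l and m): positive correlation
        ex = (Yn.T @ (1 - Yn)) / len(neigh)   # local P(l and not m): negative correlation
        score = prior.copy()
        for l in range(n_labels):
            for m in range(n_labels):
                if m == l:
                    continue
                # Boost label l when it co-occurs with locally frequent labels,
                # penalize it when it is locally exclusive with them.
                score[l] += prior[m] * (co[l, m] - ex[l, m]) / max(n_labels - 1, 1)
        preds[i] = (score >= threshold).astype(int)
    return preds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 10))
    Y = (rng.random((60, 4)) > 0.6).astype(int)
    model = fit_local_pairwise_knn(X[:50], Y[:50], k=5)
    print(predict_local_pairwise_knn(model, X[50:]))

The sketch replaces the paper's Bayesian posterior with a simple additive score so that the role of the local positive and negative pairwise statistics remains visible; for the actual model and its evaluation, see the paper at the DOI above.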
Pages: 164-174
Page count: 11
Related References
43 records in total
[21] Jia, Xu; Sun, Fuming; Li, Haojie; Cao, Yudong; Zhang, Xing. Image multi-label annotation based on supervised nonnegative matrix factorization with new matching measurement. NEUROCOMPUTING, 2017, 219: 518-525.
[22] Jing, How; Lin, Shou-De. Neural Conditional Energy Models for Multi-Label Classification. 2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2014: 240-249.
[23] Kang F., 2006, 2006 IEEE COMPUTER S, V2, P1719.
[24] Kazawa H., 2005, Advances in Neural Information Processing Systems, P649.
[25] Kumar, Abhishek; Vembu, Shankar; Menon, Aditya Krishna; Elkan, Charles. Beam search algorithms for multilabel learning. MACHINE LEARNING, 2013, 92(01): 65-89.
[26] Lin X., 2010, Proceedings of the 19th ACM international conference on Information and knowledge management, P349.
[27] Liu, Huawen; Wu, Xindong; Zhang, Shichao. Neighbor selection for multilabel classification. NEUROCOMPUTING, 2016, 182: 187-196.
[28] Qi G.J., 2007, P 15 ACM INT C MULTI, P17, DOI 10.1145/1291233.1291245.
[29] Read, Jesse; Martino, Luca; Luengo, David. Efficient monte carlo methods for multi-dimensional learning with classifier chains. PATTERN RECOGNITION, 2014, 47(03): 1535-1546.
[30] Read J, 2009, LECT NOTES ARTIF INT, V5782, P254, DOI 10.1007/978-3-642-04174-7_17.