Multi-label classification by formulating label-specific features from simultaneous instance level and feature level

Cited by: 10
Authors
Guan, Yuanyuan [1 ]
Li, Wenhui [1 ]
Zhang, Boxiang [1 ]
Han, Bing [2 ]
Ji, Manglai [1 ]
Affiliations
[1] Jilin Univ, Coll Comp Sci & Technol, Jilin, Jilin, Peoples R China
[2] Northeast Normal Univ, Jilin, Jilin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-label classification; Binary relevance; Label-specific feature; Feature distribution; SUPERVISED TOPIC MODELS; SELECTION; SPARSE;
DOI
10.1007/s10489-020-02008-4
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-label learning (MLL) trains a classification model from multi-labelled data, where each training instance is annotated with a set of class labels simultaneously. Following the binary relevance MLL paradigm, an effective recent approach is to construct specific features for each label instead of training over the original feature space. Existing label-specific methods, however, consider only information from instance distributions, which leaves the reconstructed features poorly discriminative. In this paper, we propose generating Label-spEcific feaTures by simultaneously exploring insTance distributions and fEatuRe distributions, and present a new method named Letter. Letter reconstructs two subsets of new features, one from the instance level and one from the feature level. More concretely, at the instance level, Letter incorporates a sparse constraint, and at the feature level, it clusters the original features to construct new features as an extension. The combination of these two new feature subsets forms the final set of label-specific features. Extensive experiments on a total of 14 benchmark datasets verify the competitive performance of Letter against existing state-of-the-art MLL methods.
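The abstract outlines a two-level construction per label: a sparse constraint at the instance level and clustering of the original features at the feature level, with the two resulting feature subsets concatenated. A minimal sketch of that idea, assuming a LASSO penalty as the sparse constraint and k-means as the feature clustering (the function name `letter_style_features` and both component choices are illustrative assumptions, not the authors' implementation):

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import KMeans

def letter_style_features(X, y, n_clusters=3, alpha=0.1):
    """Build label-specific features for ONE binary label y (binary
    relevance style), combining an instance-level sparse selection
    with feature-level clustering. Illustrative sketch only."""
    # Instance level: fit a sparse (LASSO) model of the label on the
    # original features and keep the features with non-zero weights.
    lasso = Lasso(alpha=alpha).fit(X, y)
    instance_feats = X[:, lasso.coef_ != 0]

    # Feature level: cluster the original features (columns of X) and
    # summarize each cluster by its mean, yielding new features.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X.T)
    feature_feats = np.column_stack(
        [X[:, km.labels_ == k].mean(axis=1) for k in range(n_clusters)]
    )

    # Final label-specific representation: both subsets combined.
    return np.hstack([instance_feats, feature_feats])
```

Under binary relevance, this would be run once per label, and a base classifier for that label trained on the returned representation.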
Pages: 3375-3390 (16 pages)
References (39 total)
[1] Boutell MR, Luo JB, Shen XP, Brown CM. Learning multi-label scene classification. PATTERN RECOGNITION, 2004, 37(09): 1757-1771.
[2] Cesa-Bianchi N, Re M, Valentini G. Synergy of multi-label hierarchical ensembles, data fusion, and cost-sensitive methods for gene functional inference. MACHINE LEARNING, 2012, 88(1-2): 209-241.
[3] Chang CC, Lin CJ. LIBSVM: A Library for Support Vector Machines. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2011, 2(03).
[4] Chen YH, Zhang FQ, Zuo WL. Deep Image Annotation and Classification by Fusing Multi-Modal Semantic Topics. KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2018, 12(01): 392-412.
[5] Cheng WW, Huellermeier E. Combining instance-based learning and logistic regression for multilabel classification. MACHINE LEARNING, 2009, 76(2-3): 211-225.
[6] Demsar J. J MACH LEARN RES, 2006, 7: 1.
[7] Elisseeff A. ADV NEUR IN, 2002, 14: 681.
[8] Fuernkranz J, Huellermeier E, Mencia EL, Brinker K. Multilabel classification via calibrated label ranking. MACHINE LEARNING, 2008, 73(02): 133-153.
[9] Gong T, Liu B, Chu Q, Yu NH. Using multi-label classification to improve object detection. NEUROCOMPUTING, 2019, 370: 174-185.
[10] Guo YM, Chung FL, Li GZ, Wang JC, Gee JC. Leveraging Label-Specific Discriminant Mapping Features for Multi-Label Learning. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2019, 13(02).