Compact learning for multi-label classification

Cited by: 14
Authors
Lv, Jiaqi [1 ,2 ]
Wu, Tianran [1 ,2 ]
Peng, Chenglun [1 ,2 ]
Liu, Yunpeng [1 ,2 ]
Xu, Ning [1 ,2 ]
Geng, Xin [1 ,2 ]
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing, Peoples R China
[2] Southeast Univ, Minist Educ, Key Lab Comp Network & Informat Integrat, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Machine learning; Multi-label classification; Label compression; Compact learning; REDUCTION;
DOI
10.1016/j.patcog.2021.107833
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-label classification (MLC) studies the problem where each instance is associated with multiple relevant labels, which leads to exponential growth of the output space. This poses a great challenge for exploring the latent relationships among labels and the intrinsic correlation between the feature and label spaces. MLC has given rise to a framework named label compression (LC), which obtains a compact space for efficient learning. Nevertheless, most existing LC methods either fail to consider the influence of the feature space or are misguided by problematic original features, which may instead degrade performance. In this paper, we present a compact learning (CL) framework that embeds the features and labels simultaneously and with mutual guidance. The proposal is a versatile concept that does not rigidly adhere to any specific embedding method and is independent of the subsequent learning process. Following its spirit, a simple yet effective implementation called compact multi-label learning (CMLL) is proposed to learn a compact low-dimensional representation for both spaces. CMLL maximizes the dependence between the embedded spaces of the labels and features, and concurrently minimizes the loss of label space recovery. Theoretically, we provide a general analysis for different embedding methods. Practically, we conduct extensive experiments to validate the effectiveness of the proposed method. (c) 2021 Elsevier Ltd. All rights reserved.
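To make the abstract's objective concrete, below is a minimal sketch of the two ingredients it names: maximizing the dependence between the embedded feature and label spaces while keeping the label space recoverable. This is not the authors' CMLL algorithm: it assumes linear kernels, under which HSIC-style dependence maximization reduces to an SVD of the centered cross-covariance matrix, and the function names, embedding dimension k, ridge term, and threshold tau are all illustrative choices.

```python
import numpy as np

def compact_embeddings(X, Y, k):
    """Jointly learn linear projections for the feature and label spaces.

    With linear kernels, maximizing an HSIC-style dependence between the
    embedded features X @ P and embedded labels Y @ Q (under orthonormal
    P, Q) reduces to taking the top-k singular vectors of the centered
    cross-covariance matrix X^T Y. Assumes k <= min(n_features, n_labels).
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    U, _, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
    return U[:, :k], Vt[:k].T  # P: d x k, Q: q x k

def fit_and_predict(X_tr, Y_tr, X_te, k=8, tau=0.5):
    """Train in the compact spaces, then recover the original label space."""
    P, Q = compact_embeddings(X_tr, Y_tr, k)
    x_mean, y_mean = X_tr.mean(axis=0), Y_tr.mean(axis=0)
    Z = (X_tr - x_mean) @ P            # embedded (compact) features
    T = (Y_tr - y_mean) @ Q            # embedded (compact) labels
    # Ridge regression from the compact feature space to the compact
    # label space; any regressor could be plugged in here.
    W = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(k), Z.T @ T)
    # Decode: Q has orthonormal columns, so Q^T maps compact label
    # predictions back to the original label space (up to the
    # reconstruction loss this kind of method aims to keep small).
    Y_hat = (X_te - x_mean) @ P @ W @ Q.T + y_mean
    return (Y_hat > tau).astype(int)
```

The sketch only conveys the mutual-guidance idea: the label projection Q is chosen with the features in view rather than from Y alone. With nonlinear kernels or an explicit recovery loss, as the paper describes for CMLL, the closed-form SVD no longer applies and the embeddings must be optimized jointly.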
Pages: 11