Privileged Multi-label Learning

Cited by: 0
Authors
You, Shan [1 ,2 ]
Xu, Chang [3 ]
Wang, Yunhe [1 ,2 ]
Xu, Chao [1 ,2 ]
Tao, Dacheng [3 ]
Affiliations
[1] Peking Univ, Key Lab Machine Percept MOE, Sch EECS, Beijing, Peoples R China
[2] Peking Univ, Cooperat Medianet Innovat Ctr, Beijing, Peoples R China
[3] Univ Sydney, FEIT, UBTech Sydney AI Inst, Sch IT, Sydney, NSW, Australia
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper presents privileged multi-label learning (PrML) to explore and exploit the relationships between labels in multi-label learning problems. We suggest that each individual label can not only be implicitly connected with the other labels via a low-rank constraint over the label predictors, but can also receive explicit comments on its performance on examples from the other labels, which together act as an oracle teacher. We generate a privileged label feature for each example and each of its individual labels, and integrate it into the framework of low-rank multi-label learning. The proposed algorithm can therefore comprehensively explore and exploit label relationships, inheriting the merits of both privileged information and low-rank constraints. We show that PrML can be efficiently solved by a dual coordinate descent algorithm using an iterative optimization strategy with cheap updates. Experiments on benchmark datasets show that privileged label features significantly improve performance and that PrML is superior to several competing methods in most cases.
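The abstract names the ingredients only at a high level: a privileged label feature per example and label, a low-rank constraint over the label predictors, and a dual coordinate descent solver. The following is a minimal illustrative sketch of the general idea, not the authors' PrML algorithm: it couples labels through a shared low-rank factorization W = U V, builds each label's privileged feature from the remaining ground-truth labels, and lets a per-label "teacher" map that feature to a slack softening the training loss (in the LUPI/SVM+ spirit). The squared-hinge loss, gradient updates in place of dual coordinate descent, the rank k, and all constants are assumptions made for this sketch.

```python
# Illustrative sketch of privileged label features + low-rank label coupling
# (NOT the authors' PrML solver). Privileged features exist only at training time.
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-label data: n examples, d features, L labels in {-1, +1}.
n, d, L, k = 200, 20, 5, 3
X = rng.normal(size=(n, d))
Y = np.sign(X @ rng.normal(size=(d, k)) @ rng.normal(size=(k, L))
            + 0.1 * rng.normal(size=(n, L)))

# Low-rank predictor W = U @ V (implicit label coupling via a shared subspace)
# and per-label teacher weights P[j] mapping the other L-1 labels to a slack.
U = 0.1 * rng.normal(size=(d, k))
V = 0.1 * rng.normal(size=(k, L))
P = np.zeros((L, L - 1))

lam_w, lam_p, lr, epochs = 1e-2, 1e-2, 5e-2, 500   # illustrative constants

for _ in range(epochs):
    margins = Y * (X @ (U @ V))                  # y_ij * w_j^T x_i for all i, j
    for j in range(L):
        priv = np.delete(Y, j, axis=1)           # privileged label feature for label j
        slack = priv @ P[j]                      # teacher's per-example slack
        resid = np.maximum(0.0, 1.0 - margins[:, j] - slack)  # squared-hinge residual
        g = -2.0 * resid * Y[:, j]               # d(loss)/d(score) per example
        g_wj = X.T @ g / n                       # gradient w.r.t. w_j = U @ V[:, j]
        U -= lr * (np.outer(g_wj, V[:, j]) + lam_w * U / L)
        V[:, j] -= lr * (U.T @ g_wj + lam_w * V[:, j])
        P[j] -= lr * (priv.T @ (-2.0 * resid) / n + lam_p * P[j])

# Prediction uses only X; the privileged label features are unavailable at test time.
print("training sign agreement:", (np.sign(X @ (U @ V)) == Y).mean())
```

Per the abstract, the actual PrML method replaces the plain gradient steps above with a dual coordinate descent scheme with cheap iterative updates.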
Pages: 3336-3342
Page count: 7