Semi-Supervised Learning With Label Proportion

Cited by: 5
Authors
Sun, Ningzhao [1 ]
Luo, Tingjin [1 ]
Zhuge, Wenzhang [1 ]
Tao, Hong [1 ]
Hou, Chenping [1 ]
Hu, Dewen [2 ]
Affiliations
[1] Natl Univ Def Technol, Coll Liberal Arts & Sci, Changsha 410073, Hunan, Peoples R China
[2] Natl Univ Def Technol, Coll Intelligence Sci & Technol, Changsha 410073, Peoples R China
Keywords
Semi-supervised learning; label proportion; constrained submodular minimization; imbalanced classification;
DOI
10.1109/TKDE.2021.3076457
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The scarcity of labels is a common and significant challenge in traditional supervised learning. Semi-supervised learning (SSL) leverages unlabeled samples to alleviate the absence of label information. Like annotations, label proportions are another type of prior information and play a significant role in classification tasks. Compared with labels, label proportions can be obtained much more easily. For example, in a hospital database only a small number of patients have a confirmed cancer diagnosis, while the proportion of patients with cancer can generally be estimated from historical records. How to incorporate such prior information on label proportion is crucial but rarely studied in the literature. Traditional SSL methods ignore this prior information, which inevitably leads to performance degradation. To solve this problem, we propose a novel SSL with Label Proportion (SSLLP) method. Our approach preserves label consistency and label proportion by imposing cardinality bound constraints. The formulated problem is equivalent to a mixed-integer constrained submodular minimization and is difficult to solve directly. We therefore transform the original problem into a convex one via the Lovász extension and design an efficient solving algorithm. Extensive experimental results demonstrate the improved performance of our method over several state-of-the-art methods.
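As a rough illustration of the cardinality-bound idea only (not the paper's actual algorithm, which solves a constrained submodular minimization via the Lovász extension), the sketch below scores unlabeled points with a hypothetical kNN voter and then hard-assigns the top-scoring fraction to the positive class, so the predicted labels exactly match a given label proportion:

```python
import numpy as np

def proportion_constrained_ssl(X_l, y_l, X_u, pos_frac, k=3):
    """Score each unlabeled point by the fraction of positive labels
    among its k nearest labeled neighbors, then enforce the known
    label proportion as a hard cardinality bound: exactly
    round(pos_frac * n_unlabeled) points receive the positive label."""
    scores = np.array([
        y_l[np.argsort(np.linalg.norm(X_l - x, axis=1))[:k]].mean()
        for x in X_u
    ])
    n_pos = int(round(pos_frac * len(X_u)))
    y_u = np.zeros(len(X_u), dtype=int)
    y_u[np.argsort(-scores)[:n_pos]] = 1  # top-scoring points go positive
    return y_u

# Toy demo: two well-separated 1-D clusters; half the unlabeled set is positive.
X_l = np.array([[0.0], [0.1], [5.0], [5.1]])
y_l = np.array([0, 0, 1, 1])
X_u = np.array([[0.2], [0.3], [4.9], [5.2]])
print(proportion_constrained_ssl(X_l, y_l, X_u, pos_frac=0.5))  # [0 0 1 1]
```

The hard top-k assignment is the simplest instance of a cardinality bound; SSLLP instead encodes such bounds as constraints in a submodular objective and optimizes a convex relaxation.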
Pages: 877 - 890
Page count: 14
Related papers
50 records in total
  • [1] A note on label propagation for semi-supervised learning
    Bodo, Zalan
    Csato, Lehel
    ACTA UNIVERSITATIS SAPIENTIAE INFORMATICA, 2015, 7 (01) : 18 - 30
  • [2] LABEL REUSE FOR EFFICIENT SEMI-SUPERVISED LEARNING
    Hsieh, Tsung-Hung
    Chen, Jun-Cheng
    Chen, Chu-Song
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3697 - 3701
  • [3] Label Propagation for Deep Semi-supervised Learning
    Iscen, Ahmet
    Tolias, Giorgos
    Avrithis, Yannis
    Chum, Ondrej
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 5065 - 5074
  • [4] Logistic Label Propagation for Semi-supervised Learning
    Watanabe, Kenji
    Kobayashi, Takumi
    Otsu, Nobuyuki
    NEURAL INFORMATION PROCESSING: THEORY AND ALGORITHMS, PT I, 2010, 6443 : 462 - 469
  • [5] Semi-supervised Learning by Spectral Mapping with Label Information
    Zhao, Zhong-Qiu
    Gao, Jun
    Wu, Xindong
    ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, PT I, 2010, 6319 : 448 - +
  • [6] ReLaB: Reliable Label Bootstrapping for Semi-Supervised Learning
    Albert, Paul
    Ortego, Diego
    Arazo, Eric
    O'Connor, Noel
    McGuinness, Kevin
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [7] Cyclic label propagation for graph semi-supervised learning
    Li, Zhao
    Liu, Yixin
    Zhang, Zhen
    Pan, Shirui
    Gao, Jianliang
    Bu, Jiajun
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2022, 25 (02): 703 - 721
  • [8] Robust Semi-Supervised Learning through Label Aggregation
    Yan, Yan
    Xu, Zhongwen
    Tsang, Ivor W.
    Long, Guodong
    Yang, Yi
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 2244 - 2250
  • [9] Semi-Supervised Partial Multi-Label Learning
    Xie, Ming-Kun
    Huang, Sheng-Jun
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 691 - 700