Few-shot partial multi-label learning with credible non-candidate label

Cited: 0
Authors
Wang, Meng [1 ,2 ]
Zhao, Yunfeng [1 ,2 ]
Yan, Zhongmin [1 ,2 ]
Zhang, Jinglin [3 ]
Wang, Jun [1 ,2 ]
Yu, Guoxian [1 ,2 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan, Peoples R China
[2] Shandong Univ, Joint SDU NTU Ctr Artificial Intelligence Res, Jinan, Peoples R China
[3] Shandong Univ, Sch Control Sci & Engn, Jinan, Peoples R China
Keywords
Partial multi-label learning; Few-shot learning; Credible non-candidate label; Data augmentation
DOI
10.1016/j.ins.2025.122485
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Partial multi-label learning (PML) addresses scenarios where each training sample is associated with multiple candidate labels, only a subset of which are ground-truth labels. The primary difficulty in PML is to mitigate the negative impact of noisy labels. Most existing PML methods rely on sufficient samples to train a noise-robust multi-label classifier. However, in practical scenarios, such as privacy-sensitive domains or those with limited data, only a few training samples are typically available for the target task. In this paper, we propose an approach called FsPML-CNL (Few-shot Partial Multi-label Learning with Credible Non-candidate Label) to tackle the PML problem with few-shot training samples. Specifically, FsPML-CNL first utilizes the sample features and feature-prototype similarity in the embedding space to disambiguate candidate labels and to obtain label prototypes. Then, a credible non-candidate label is selected based on label correlation and confidence, and its prototype is incorporated into the training samples to generate new data that boosts the supervised information. Finally, a noise-tolerant multi-label classifier is induced from the original and generated samples using a confidence-guided loss. Extensive experiments on public datasets demonstrate that FsPML-CNL outperforms competitive baselines across different settings.
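For readers who want a concrete picture of the pipeline outlined in the abstract, the snippet below is a minimal NumPy sketch of two of its ingredients: confidence-weighted label prototypes with feature-prototype disambiguation over candidate labels, and prototype-guided generation of a new sample from a selected non-candidate label. The function names (label_prototypes, disambiguate, augment_with_noncandidate), the correlation-based selection score, and the mixing coefficient alpha are illustrative assumptions, not the authors' implementation of FsPML-CNL.

```python
import numpy as np

def label_prototypes(X, C, conf):
    """Confidence-weighted label prototypes in the embedding space.

    X    : (n, d) sample embeddings
    C    : (n, q) binary candidate-label matrix
    conf : (n, q) current label confidences (zero on non-candidate entries)
    """
    W = C * conf                                              # weight only candidate entries
    return (W.T @ X) / np.maximum(W.sum(0), 1e-8)[:, None]    # (q, d) prototypes

def disambiguate(X, C, P, tau=1.0):
    """Update candidate-label confidences from feature-prototype similarity."""
    sim = np.where(C > 0, (X @ P.T) / tau, -np.inf)           # restrict to candidate labels
    e = np.exp(sim - sim.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)                   # softmax over candidate labels

def augment_with_noncandidate(x, c, conf_row, P, corr, alpha=0.3):
    """Mix the prototype of one credible non-candidate label into sample x.

    corr is a (q, q) label-correlation matrix (e.g., candidate co-occurrence);
    the scoring rule and the mixing coefficient alpha are illustrative choices.
    """
    non_cand = np.where(c == 0)[0]
    score = corr[non_cand] @ (c * conf_row)                   # correlation with confident candidates
    k = non_cand[np.argmax(score)]                            # most credible non-candidate label
    x_new = (1 - alpha) * x + alpha * P[k]                    # prototype-guided new sample
    y_new = c.copy()
    y_new[k] = 1                                              # mixed-in label supervises the new sample
    return x_new, y_new
```

In this sketch, the generated pair (x_new, y_new) would be added to the few-shot training set alongside the original samples before inducing the classifier; how the actual method weights original versus generated samples in its confidence-guided loss is not specified here.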
Pages: 21