A Time-Consistency Curriculum for Learning From Instance-Dependent Noisy Labels

Times Cited: 1
Authors
Wu, Songhua [1 ,2 ]
Zhou, Tianyi [3 ,4 ]
Du, Yuxuan [5 ]
Yu, Jun [6 ]
Han, Bo [7 ]
Liu, Tongliang [1 ,2 ]
Affiliations
[1] Univ Sci & Technol China USTC, Dept Automat, Hefei 230026, Peoples R China
[2] Univ Sydney, Sch Comp Sci, Trustworthy Machine Learning Lab, Darlington, NSW 2008, Australia
[3] Univ Maryland, Sch Comp Sci, College Pk, MD 20742 USA
[4] Univ Maryland, UMIACS, College Pk, MD 20742 USA
[5] JD Explore Acad, Beijing 10111, Peoples R China
[6] Univ Sci & Technol China, Dept Automat, Hefei 230027, Peoples R China
[7] Hong Kong Baptist Univ, Dept Comp Sci, Kowloon Tong, Hong Kong, Peoples R China
Funding
Australian Research Council;
Keywords
Training; Noise measurement; Data models; Computational modeling; Estimation; Predictive models; Computer science; Instance-dependent noisy labels; time-consistent curriculum learning; image classification;
DOI
10.1109/TPAMI.2024.3360623
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many machine learning algorithms are known to be fragile even to simple, instance-independent noisy labels. However, noisy labels in real-world data are more damaging since they are produced by more complicated mechanisms in an instance-dependent manner. In this paper, we target the practical challenge of instance-dependent noisy labels by jointly training (1) a model that reversely engineers the noise-generating mechanism, producing an instance-dependent mapping from the clean label posterior to the observed noisy label, and (2) a robust classifier that produces clean label posteriors. Compared to previous methods, the former model is novel and enables end-to-end learning of the latter directly from noisy labels. An extensive empirical study indicates that the time-consistency of data is critical to the success of training both models, and it motivates us to develop a curriculum that selects training data based on their dynamics on the two models' outputs over the course of training. We show that the curriculum-selected data provide both clean labels and high-quality input-output pairs for training the two models. Therefore, it leads to promising and robust classification performance even in notably challenging settings of instance-dependent noisy labels where many SoTA methods can easily fail. Extensive experimental comparisons and ablation studies further demonstrate the advantages and significance of the time-consistency curriculum in learning from instance-dependent noisy labels on multiple benchmark datasets.
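The abstract outlines two coupled components, a classifier that outputs clean-label posteriors and a noise model that maps them to the observed noisy labels, plus a curriculum that keeps only samples whose predictions stay consistent over training. The PyTorch sketch below shows one way such a pipeline could be wired together; it is a minimal illustration assuming CIFAR-sized 3x32x32 inputs, a per-instance transition matrix produced by a small network (here called TransitionNet), a data loader that also yields sample indices, and a simple "same predicted label over the last few epochs" consistency test. These specifics are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, not the paper's implementation. Assumptions: 3x32x32 inputs,
# a loader yielding (x, noisy_y, idx), and the illustrative names below.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 10
HISTORY = 3  # predictions must agree over this many recent epochs (illustrative choice)

class Classifier(nn.Module):
    """Outputs clean-label posteriors p(y | x)."""
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, hidden),
                                 nn.ReLU(), nn.Linear(hidden, NUM_CLASSES))
    def forward(self, x):
        return F.softmax(self.net(x), dim=-1)

class TransitionNet(nn.Module):
    """Reverse-engineers the noise: outputs a per-instance matrix T(x) whose entry
    T(x)[i, j] plays the role of P(noisy label = j | clean label = i, x)."""
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, hidden),
                                 nn.ReLU(), nn.Linear(hidden, NUM_CLASSES * NUM_CLASSES))
    def forward(self, x):
        logits = self.net(x).view(-1, NUM_CLASSES, NUM_CLASSES)
        return F.softmax(logits, dim=-1)  # each row is a distribution over noisy labels

def noisy_posterior(clean_post, T):
    """p(noisy | x) = T(x)^T p(clean | x), computed per instance in the batch."""
    return torch.bmm(clean_post.unsqueeze(1), T).squeeze(1)

def train_one_epoch(clf, trans, loader, optimizer, pred_history, epoch):
    for x, noisy_y, idx in loader:                      # loader must also yield sample indices
        clean_post = clf(x)
        noisy_post = noisy_posterior(clean_post, trans(x))

        # Time-consistency curriculum: keep samples whose predicted clean label
        # has not changed over the last HISTORY epochs.
        preds = clean_post.argmax(dim=-1)
        pred_history[idx, epoch % HISTORY] = preds
        keep = (pred_history[idx] == preds.unsqueeze(1)).all(dim=1)
        if epoch < HISTORY or keep.sum() == 0:           # warm-up: no filtering yet
            keep = torch.ones_like(keep)

        # End-to-end loss on the *noisy* labels, propagated through the noise model.
        loss = F.nll_loss(torch.log(noisy_post[keep] + 1e-8), noisy_y[keep])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In this sketch the history buffer would be created as pred_history = torch.full((len(dataset), HISTORY), -1, dtype=torch.long), and the optimizer should cover the parameters of both networks so the classifier and the noise model are learned jointly. Filtering on prediction stability across epochs, rather than on the instantaneous loss, is what makes this a time-consistency criterion instead of a small-loss criterion.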
Pages: 4830 - 4842
Number of pages: 13
Related Papers
19 records in total
  • [1] Centrality and Consistency: Two-Stage Clean Samples Identification for Learning with Instance-Dependent Noisy Labels
    Zhao, Ganlong
    Li, Guanbin
    Qin, Yipeng
    Liu, Feng
    Yu, Yizhou
    COMPUTER VISION, ECCV 2022, PT XXV, 2022, 13685 : 21 - 37
  • [2] Consistency-Regularized Learning for Remote Sensing Scene Classification With Noisy Labels
    Hu, Ruizhe
    Li, Zuoyong
    Wang, Tao
    Hu, Rong
    Papageorgiou, George N.
    Liu, Guang-Hai
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2024, 21
  • [3] Transferring Annotator- and Instance-Dependent Transition Matrix for Learning From Crowds
    Li, Shikun
    Xia, Xiaobo
    Deng, Jiankang
    Ge, Shiming
    Liu, Tongliang
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (11) : 7377 - 7391
  • [4] Instance-Dependent Inaccurate Label Distribution Learning
    Kou, Zhiqiang
    Wang, Jing
    Jia, Yuheng
    Liu, Biao
    Geng, Xin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 1425 - 1437
  • [5] Curriculum-Based Federated Learning for Machine Fault Diagnosis With Noisy Labels
    Sun, Wenjun
    Yan, Ruqiang
    Jin, Ruibing
    Zhao, Rui
    Chen, Zhenghua
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (12) : 13820 - 13830
  • [6] Consistency Regularization on Clean Samples for Learning with Noisy Labels
    Nomura, Yuichiro
    Kurita, Takio
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (02) : 387 - 395
  • [7] Subclass consistency regularization for learning with noisy labels based on contrastive learning
    Sun, Xinkai
    Zhang, Sanguo
    NEUROCOMPUTING, 2025, 614
  • [8] Learning From Noisy Labels With Deep Neural Networks: A Survey
    Song, Hwanjun
    Kim, Minseok
    Park, Dongmin
    Shin, Yooju
    Lee, Jae-Gil
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (11) : 8135 - 8153
  • [9] Learning From Noisy Labels via Dynamic Loss Thresholding
    Yang, Hao
    Jin, You-Zhi
    Li, Zi-Yin
    Wang, Deng-Bao
    Geng, Xin
    Zhang, Min-Ling
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6503 - 6516
  • [10] MetaLabelNet: Learning to Generate Soft-Labels From Noisy-Labels
    Algan, Gorkem
    Ulusoy, Ilkay
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31 : 4352 - 4362