Robust Loss Functions for Training Decision Trees with Noisy Labels

Cited by: 0
Authors
Wilton, Jonathan [1 ]
Ye, Nan [1 ]
Affiliations
[1] Univ Queensland, Brisbane, Qld, Australia
Source
THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 14 | 2024
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider training decision trees using noisily labeled data, focusing on loss functions that can lead to robust learning algorithms. Our contributions are threefold. First, we offer novel theoretical insights on the robustness of many existing loss functions in the context of decision tree learning. We show that some of the losses belong to a class we call conservative losses, which lead to an early stopping behavior during training and noise-tolerant predictions during testing. Second, we introduce a framework for constructing robust loss functions, called distribution losses. These losses apply percentile-based penalties based on an assumed margin distribution, and they naturally allow adapting to different noise rates via a robustness parameter. In particular, we introduce a new loss called the negative exponential loss, which leads to an efficient greedy impurity-reduction learning algorithm. Lastly, our experiments on multiple datasets and noise settings validate our theoretical insights and the effectiveness of our adaptive negative exponential loss.
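The abstract mentions a greedy impurity-reduction learner derived from the negative exponential loss. As a rough illustration only, below is a minimal sketch of a CART-style split search with an exponential-of-margin impurity plugged in; the impurity formula, the robustness parameter `alpha`, and the helper names `neg_exp_impurity` / `best_split` are assumptions made for this sketch and are not taken from the paper.

```python
import numpy as np

def neg_exp_impurity(y, alpha=1.0):
    """Assumed negative-exponential-style impurity of a node with binary labels y in {-1, +1}.

    The exact form used in the paper is not given in this abstract; an
    exp(-alpha * |margin|) penalty is used here purely for illustration.
    """
    if len(y) == 0:
        return 0.0
    p = np.mean(y == 1)                   # empirical fraction of positives
    margin = 2.0 * p - 1.0                # signed confidence of the majority vote
    return np.exp(-alpha * abs(margin))   # largest when the node is maximally impure

def best_split(X, y, alpha=1.0):
    """Greedy search for the single (feature, threshold) split that most
    reduces the size-weighted impurity -- the usual CART-style loop, with
    the assumed negative exponential impurity as the splitting criterion."""
    n, d = X.shape
    parent = neg_exp_impurity(y, alpha)
    best = (None, None, 0.0)              # (feature index, threshold, impurity reduction)
    for j in range(d):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            child = (len(left) * neg_exp_impurity(left, alpha)
                     + len(right) * neg_exp_impurity(right, alpha)) / n
            reduction = parent - child
            if reduction > best[2]:
                best = (j, t, reduction)
    return best
```

When no candidate split yields a positive impurity reduction, the node is left unsplit; this mirrors the early-stopping behavior the abstract attributes to conservative losses.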
Pages: 15859-15867
Page count: 9