Cost-sensitive learning with modified Stein loss function

Cited by: 19
Authors
Fu, Saiji [1 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Tang, Jingjing [6 ]
Liu, Xiaohui [7 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing 100876, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] MOE Social Sci Lab Digital Econ Forecasts & Policy, Beijing 100190, Peoples R China
[6] Southwestern Univ Finance & Econ, Fac Business Adm, Sch Business Adm, Sichuan 611130, Peoples R China
[7] Brunel Univ London, Dept Comp Sci, London UB8 3PH, England
Funding
National Natural Science Foundation of China;
Keywords
Class imbalance learning; Cost-sensitive learning; Stein loss function; Penalty parameter; Support vector machine; SUPPORT VECTOR MACHINE; CLASS-IMBALANCE; CLASSIFICATION; SMOTE;
DOI
10.1016/j.neucom.2023.01.052
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cost-sensitive learning (CSL), which has gained widespread attention in class imbalance learning (CIL), can be implemented either by tuning penalty parameters or by designing new loss functions. In this paper, we propose a cost-sensitive learning method with a modified Stein loss function (CSMS) and a robust variant (RCSMS). CSMS is flexible in that it realizes CSL through both mechanisms simultaneously, whereas RCSMS achieves CSL only by tuning penalty parameters but adopts a loss function that makes it insensitive to noise. To the best of our knowledge, this is the first time the Stein loss function, which originates in statistics, has been applied in machine learning; it not only offers two alternative solutions to class imbalance but also provides a novel idea for the design of loss functions in CIL. The mini-batch stochastic sub-gradient descent (MBGD) approach is employed to optimize CSMS and RCSMS, and Rademacher complexity is used to analyze their generalization error bounds. Extensive experiments confirm the superiority of both models over the benchmarks.
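To make the two mechanisms in the abstract concrete, the sketch below illustrates cost-sensitive learning via class-dependent penalty parameters under a Stein-style surrogate loss, trained with mini-batch stochastic sub-gradient descent. It is a hedged stand-in, not the paper's CSMS/RCSMS: the exact modified Stein loss is defined only in the paper, so the loss here composes the classical Stein function s(u) = u - ln(u) - 1 (u > 0, minimized at u = 1) with the margin m = y*(w.x + b) via u = exp(1 - m), and the penalties C_pos/C_neg, the function names, and all hyperparameters are hypothetical.

# Illustrative sketch only; NOT the paper's actual CSMS/RCSMS formulation.
# Assumed stand-in loss per sample: s(exp(1 - m)) = exp(1 - m) + m - 2,
# weighted by a class-dependent penalty, optimized by MBGD.
import numpy as np

def stein_like_grad(margins):
    # d/dm [exp(1 - m) + m - 2] = 1 - exp(1 - m); the exponent is
    # clipped to avoid overflow on very negative margins.
    return 1.0 - np.exp(np.minimum(1.0 - margins, 50.0))

def train_cost_sensitive_mbgd(X, y, C_pos=1.0, C_neg=1.0, lam=1e-2,
                              lr=0.1, epochs=50, batch=32, seed=0):
    # Linear model (w, b) with class-dependent penalties;
    # labels y are expected in {-1, +1}.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    cost = np.where(y == 1, C_pos, C_neg)  # per-sample penalty
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            B = idx[start:start + batch]
            m = y[B] * (X[B] @ w + b)          # margins on the batch
            g = cost[B] * stein_like_grad(m)   # dL/dm, cost-weighted
            w -= lr * (lam * w + ((g * y[B]) @ X[B]) / len(B))
            b -= lr * (g * y[B]).mean()
    return w, b

# Example: upweight a rare positive class roughly by the imbalance
# ratio, the penalty-tuning route the abstract attributes to RCSMS:
#   w, b = train_cost_sensitive_mbgd(X, y, C_pos=10.0, C_neg=1.0)

Note that this assumed surrogate penalizes deviation from a unit margin on both sides, a property of the Stein shape reminiscent of least-squares SVMs; the paper's modified Stein loss may differ, and CSMS additionally builds cost sensitivity into the loss itself rather than only into the penalties.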
Pages: 57-75
Number of pages: 19
Related Papers
50 records in total
  • [21] LW-ELM: A Fast and Flexible Cost-Sensitive Learning Framework for Classifying Imbalanced Data
    Yu, Hualong
    Sun, Changyin
    Yang, Xibei
    Zheng, Shang
    Wang, Qi
    Xi, Xiaoyan
    IEEE ACCESS, 2018, 6 : 28488 - 28500
  • [22] Active Learning for Cost-Sensitive Classification
    Krishnamurthy, Akshay
    Agarwal, Alekh
    Huang, Tzu-Kuo
    Daume, Hal, III
    Langford, John
    JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20
  • [23] Cost-sensitive learning for imbalanced medical data: a review
    Araf, Imane
    Idri, Ali
    Chairi, Ikram
    ARTIFICIAL INTELLIGENCE REVIEW, 2024, 57 (04)
  • [24] Analysis of imbalanced data using cost-sensitive learning
    Kim, Sojin
    Song, Jongwoo
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2025,
  • [25] Robust SVM for Cost-Sensitive Learning
    Gan, Jiangzhang
    Li, Jiaye
    Xie, Yangcai
    NEURAL PROCESSING LETTERS, 2022, 54 : 2737 - 2758
  • [26] Cost-Sensitive Boosting
    Masnadi-Shirazi, Hamed
    Vasconcelos, Nuno
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2011, 33 (02) : 294 - 309
  • [27] Adaptive learning cost-sensitive convolutional neural network
    Hou, Yun
    Fan, Hong
    Li, Li
    Li, Bailin
    IET COMPUTER VISION, 2021, 15 (05) : 346 - 355
  • [28] Cost-sensitive learning using logical analysis of data
    Osman, Hany
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (06) : 3571 - 3606
  • [29] Cost-sensitive support vector machines
    Iranmehr, Arya
    Masnadi-Shirazi, Hamed
    Vasconcelos, Nuno
    NEUROCOMPUTING, 2019, 343 : 50 - 64
  • [30] Cost-sensitive learning for defect escalation
    Sheng, Victor S.
    Gu, Bin
    Fang, Wei
    Wu, Jian
    KNOWLEDGE-BASED SYSTEMS, 2014, 66 : 146 - 155