Cost-sensitive learning with modified Stein loss function

Cited by: 19
|
Authors
Fu, Saiji [1 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Tang, Jingjing [6 ]
Liu, Xiaohui [7 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing 100876, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] MOE Social Sci Lab Digital Econ Forecasts & Policy, Beijing 100190, Peoples R China
[6] Southwestern Univ Finance & Econ, Fac Business Adm, Sch Business Adm, Sichuan 611130, Peoples R China
[7] Brunel Univ London, Dept Comp Sci, London UB8 3PH, England
Funding
National Natural Science Foundation of China;
Keywords
Class imbalance learning; Cost-sensitive learning; Stein loss function; Penalty parameter; Support vector machine; SUPPORT VECTOR MACHINE; CLASS-IMBALANCE; CLASSIFICATION; SMOTE;
DOI
10.1016/j.neucom.2023.01.052
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cost-sensitive learning (CSL), which has gained widespread attention in class imbalance learning (CIL), can be implemented either by tuning penalty parameters or by designing new loss functions. In this paper, we propose a cost-sensitive learning method with a modified Stein loss function (CSMS) and a robust variant, RCSMS. Specifically, CSMS is flexible in that it realizes CSL through both of these routes simultaneously. In contrast, RCSMS achieves CSL only by tuning penalty parameters, but its adopted loss function makes it insensitive to noise. To the best of our knowledge, this is the first time the Stein loss function, which originates in statistics, has been applied in machine learning; this not only offers two alternative class-imbalance solutions but also provides a novel idea for the design of loss functions in CIL. The mini-batch stochastic sub-gradient descent (MBGD) approach is employed to optimize CSMS and RCSMS, and Rademacher complexity is used to analyze their generalization error bounds. Extensive experiments confirm the superiority of both models over the benchmarks. (c) 2023 Elsevier B.V. All rights reserved.
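For context: the classical Stein loss from statistics has the scalar form L(a, b) = a/b - log(a/b) - 1. The paper's modified Stein loss and the exact CSMS/RCSMS formulations are not reproduced in this record, so the sketch below illustrates only the first, generic route the abstract mentions: CSL via class-dependent penalty parameters on a standard hinge loss, trained with mini-batch stochastic sub-gradient descent (MBGD). All names, penalty values, and the toy data are illustrative assumptions, not the authors' method.

```python
import numpy as np

def cs_hinge_subgrad(w, X, y, C_pos, C_neg, lam=0.01):
    """Sub-gradient of a class-weighted hinge loss (penalty-parameter CSL).

    Each sample's hinge term is scaled by C_pos or C_neg depending on its
    label, so the minority class can be penalized more heavily.
    """
    margins = y * (X @ w)
    C = np.where(y > 0, C_pos, C_neg)       # per-sample penalty parameter
    active = margins < 1                    # samples violating the margin
    return lam * w - (C[active] * y[active]) @ X[active] / len(y)

def train_mbgd(X, y, C_pos, C_neg, lr=0.1, epochs=200, batch=32, seed=0):
    """Mini-batch stochastic sub-gradient descent on the weighted hinge."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        idx = rng.permutation(len(y))
        for start in range(0, len(y), batch):
            b = idx[start:start + batch]
            w -= lr * cs_hinge_subgrad(w, X[b], y[b], C_pos, C_neg)
    return w

# Toy imbalanced data: 90 negatives, 10 positives, plus a bias feature.
rng = np.random.default_rng(1)
Xn = rng.normal(-1.0, 1.0, (90, 2))
Xp = rng.normal(+1.0, 1.0, (10, 2))
X = np.hstack([np.vstack([Xn, Xp]), np.ones((100, 1))])
y = np.array([-1] * 90 + [1] * 10)

w = train_mbgd(X, y, C_pos=9.0, C_neg=1.0)  # up-weight the minority class
recall_pos = (np.sign(X @ w)[y > 0] > 0).mean()
```

Setting C_pos roughly to the imbalance ratio (9:1 here) balances the two classes' total penalty mass, which is the usual starting point for the penalty-parameter route; the loss-design route instead reshapes the per-sample loss itself, as the paper's modified Stein loss does.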
Pages: 57 - 75
Page count: 19
Related papers
50 records total
  • [1] Universum driven cost-sensitive learning method with asymmetric loss function
    Liu, Dalian
    Fu, Saiji
    Tian, Yingjie
    Tang, Jingjing
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 131
  • [2] Cost-sensitive positive and unlabeled learning
    Chen, Xiuhua
    Gong, Chen
    Yang, Jian
    INFORMATION SCIENCES, 2021, 558 : 229 - 245
  • [3] Incremental Cost-Sensitive Support Vector Machine With Linear-Exponential Loss
    Ma, Yue
    Zhao, Kun
    Wang, Qi
    Tian, Yingjie
    IEEE ACCESS, 2020, 8 : 149899 - 149914
  • [5] Cost-sensitive ensemble learning: a unifying framework
    Petrides, George
    Verbeke, Wouter
    DATA MINING AND KNOWLEDGE DISCOVERY, 2022, 36 (01) : 1 - 28
  • [6] On the Role of Cost-Sensitive Learning in Imbalanced Data Oversampling
    Krawczyk, Bartosz
    Wozniak, Michal
    COMPUTATIONAL SCIENCE - ICCS 2019, PT III, 2019, 11538 : 180 - 191
  • [7] Cost-Sensitive Learning with Noisy Labels
    Natarajan, Nagarajan
    Dhillon, Inderjit S.
    Ravikumar, Pradeep
    Tewari, Ambuj
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18 : 1 - 33
  • [8] Roulette sampling for cost-sensitive learning
    Sheng, Victor S.
    Ling, Charles X.
MACHINE LEARNING: ECML 2007, PROCEEDINGS, 2007, 4701 : 724+
  • [9] Multi-view cost-sensitive kernel learning for imbalanced classification problem
    Tang, Jingjing
    Hou, Zhaojie
    Yu, Xiaotong
    Fu, Saiji
    Tian, Yingjie
    NEUROCOMPUTING, 2023, 552
  • [10] Cost-sensitive learning based on Bregman divergences
    Santos-Rodriguez, Raul
    Guerrero-Curieses, Alicia
    Alaiz-Rodriguez, Rocio
    Cid-Sueiro, Jesus
    MACHINE LEARNING, 2009, 76 (2-3) : 271 - 285