Robust regression under the general framework of bounded loss functions

Cited by: 13
Authors
Fu, Saiji [1 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Tang, Long [6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing 100876, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] MOE Social Sci Lab Digital Econ Forecasts & Policy, Beijing 100190, Peoples R China
[6] Nanjing Univ Informat Sci & Technol, Sch Artificial Intelligence, Nanjing 210044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Robustness and sensitivity analysis; Bounded loss function; Regression; Least squares loss function; Support vector regression; SUPPORT VECTOR REGRESSION; MACHINE; SPARSE; BLINEX;
DOI
10.1016/j.ejor.2023.04.025
Chinese Library Classification
C93 [Management Science];
Discipline Classification Codes
12; 1201; 1202; 120202;
Abstract
Conventional regression methods often fail in the presence of noise. Applying a bounded loss function is an effective way to enhance regressor robustness. However, most bounded loss functions exist in Ramp-style forms, losing some inherent properties of the original function due to hard truncation. Moreover, there is currently no unified framework for designing bounded loss functions. In response to these two issues, this paper proposes a general framework that can smoothly and adaptively bound any non-negative function. The bounded loss can not only degenerate to the original function but also inherit its elegant properties, including symmetry, differentiability and smoothness. Under this framework, a robust regressor called bounded least squares support vector regression (BLSSVR) is proposed to mitigate the effects of noise and outliers by limiting the maximum loss. With appropriate parameters, the bounded least squares loss grows faster than its unbounded form in the initial stage, which helps BLSSVR assign larger weights to non-outlier points. Meanwhile, the Nesterov accelerated gradient (NAG) algorithm is employed to optimize BLSSVR. Extensive experiments on synthetic and real-world datasets demonstrate the superiority of BLSSVR over benchmark methods. © 2023 Elsevier B.V. All rights reserved.
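The abstract does not give the paper's exact bounding construction, but the general idea — smoothly capping a non-negative loss so gross outliers stop dominating the gradient, then training with NAG — can be sketched with a common exponential-style bounding transform. This is an illustrative sketch, not the authors' BLSSVR: the transform `M * (1 - exp(-loss / M))`, the functions `bounded_loss` and `fit_bounded_lssvr`, and the parameters `M` (loss cap) and `C` (data-fit weight) are all assumptions chosen for the demonstration.

```python
import numpy as np

def bounded_loss(raw_loss, M):
    """Smoothly bound a non-negative loss by M via M * (1 - exp(-raw_loss / M)).
    As M -> infinity this recovers raw_loss; it stays differentiable and, for a
    symmetric raw loss, symmetric (illustrative transform, not the paper's)."""
    return M * (1.0 - np.exp(-raw_loss / M))

def bounding_grad_factor(raw_loss, M):
    """d(bounded)/d(raw) = exp(-raw_loss / M): points with huge raw loss
    (outliers) have their gradient contribution scaled toward zero, which is
    the robustness mechanism."""
    return np.exp(-raw_loss / M)

def fit_bounded_lssvr(X, y, C=10.0, M=100.0, lr=0.01, momentum=0.9, iters=2000):
    """Linear regressor trained on the bounded least-squares loss with
    Nesterov accelerated gradient (look-ahead momentum). Objective (sketch):
    0.5*||w||^2 + (C/n) * sum_i bounded_loss(r_i^2, M), r_i = w.x_i + b - y_i."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    vw, vb = np.zeros(d), 0.0
    for _ in range(iters):
        # Nesterov: evaluate the gradient at the look-ahead point.
        w_la, b_la = w + momentum * vw, b + momentum * vb
        r = X @ w_la + b_la - y          # residuals
        scale = bounding_grad_factor(r ** 2, M)
        gw = w_la + (C / n) * (X.T @ (2.0 * scale * r))
        gb = (C / n) * np.sum(2.0 * scale * r)
        vw = momentum * vw - lr * gw
        vb = momentum * vb - lr * gb
        w, b = w + vw, b + vb
    return w, b
```

On data with a few gross outliers, the exponential factor drives their gradient contribution to nearly zero, so the fitted line tracks the clean trend while an unbounded least-squares fit would be pulled toward the outliers.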
Pages: 1325-1339
Page count: 15
Related Papers
50 records
  • [1] Zhang, Jiaqi; Yang, Hu. Bounded quantile loss for robust support vector machines-based classification and regression. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 242.
  • [2] Tang, Jingjing; Liu, Bangxin; Fu, Saiji; Tian, Yingjie; Kou, Gang. Advancing robust regression: Addressing asymmetric noise with the BLINEX loss function. INFORMATION FUSION, 2024, 110.
  • [3] Ye, Yafen; Gao, Junbin; Shao, Yuanhai; Li, Chunna; Jin, Yan; Hua, Xiangyu. Robust support vector regression with generic quadratic nonconvex ε-insensitive loss. APPLIED MATHEMATICAL MODELLING, 2020, 82: 235-251.
  • [4] Fu, Saiji; Wang, Xiaoxiao; Tang, Jingjing; Lan, Shulin; Tian, Yingjie. Generalized robust loss functions for machine learning. NEURAL NETWORKS, 2024, 171: 200-214.
  • [5] Kaminska, Agnieszka; Porosinski, Zdzislaw. On robust Bayesian estimation under some asymmetric and bounded loss function. STATISTICS, 2009, 43 (03): 253-265.
  • [6] Yang, Liming; Ding, Guangsheng; Yuan, Chao; Zhang, Min. Robust regression framework with asymmetrically analogous to correntropy-induced loss. KNOWLEDGE-BASED SYSTEMS, 2020, 191.
  • [7] Chen, Chuanfa; Yan, Changqing; Zhao, Na; Guo, Bin; Liu, Guolin. A robust algorithm of support vector regression with a trimmed Huber loss function in the primal. SOFT COMPUTING, 2017, 21 (18): 5235-5243.
  • [8] Karal, Omer. Maximum likelihood optimal and robust Support Vector Regression with lncosh loss function. NEURAL NETWORKS, 2017, 94: 1-12.
  • [9] Yao, Biqiang. A Framework Based on Support Vector Regression for Robust Optimization. MANUFACTURING ENGINEERING AND AUTOMATION I, PTS 1-3, 2011, 139-141: 1073-1078.
  • [10] Gupta, Deepak; Gupta, Umesh. On robust asymmetric Lagrangian ν-twin support vector regression using pinball loss function. APPLIED SOFT COMPUTING, 2021, 102.