Robust regression under the general framework of bounded loss functions

Times Cited: 13
Authors
Fu, Saiji [1 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Tang, Long [6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing 100876, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] MOE Social Sci Lab Digital Econ Forecasts & Policy, Beijing 100190, Peoples R China
[6] Nanjing Univ Informat Sci & Technol, Sch Artificial Intelligence, Nanjing 210044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Robustness and sensitivity analysis; Bounded loss function; Regression; Least squares loss function; Support vector regression; SUPPORT VECTOR REGRESSION; MACHINE; SPARSE; BLINEX;
DOI
10.1016/j.ejor.2023.04.025
CLC Classification
C93 [Management];
Discipline Classification Codes
12 ; 1201 ; 1202 ; 120202 ;
Abstract
Conventional regression methods often fail when encountering noise. The application of a bounded loss function is an effective means to enhance regressor robustness. However, most bounded loss functions exist in Ramp-style forms, losing some inherent properties of the original function due to hard truncation. Besides, there is currently no unified framework on how to design bounded loss functions. In response to the above two issues, this paper proposes a general framework that can smoothly and adaptively bound any non-negative function. It can not only degenerate to the original function, but also inherit its elegant properties, including symmetry, differentiability and smoothness. Under this framework, a robust regressor called bounded least squares support vector regression (BLSSVR) is proposed to mitigate the effects of noise and outliers by limiting the maximum loss. With appropriate parameters, the bounded least squares loss grows faster than its unbounded form in the initial stage, which facilitates BLSSVR to assign larger weights to non-outlier points. Meanwhile, the Nesterov accelerated gradient (NAG) algorithm is employed to optimize BLSSVR. Extensive experiments on synthetic and real-world datasets demonstrate the superiority of BLSSVR over benchmark methods. © 2023 Elsevier B.V. All rights reserved.
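The abstract's idea can be illustrated with a minimal sketch. The paper's exact bounding transform is not given in this record, so the form below is an assumption: `B(L) = M * (1 - exp(-L / M))` smoothly bounds any non-negative loss `L` at `M`, stays differentiable, and degenerates to `L` as `M → ∞`. Applying it to the squared residual and minimizing with a simple Nesterov accelerated gradient loop gives a toy robust regressor in the spirit of BLSSVR; all function names and hyperparameter values here are illustrative, not the authors' implementation.

```python
import numpy as np

def bounded_loss(raw_loss, M):
    # Hypothetical smooth bounding transform (assumed form, not the paper's):
    # B(L) = M * (1 - exp(-L / M)). B <= M for any L >= 0, inherits the
    # smoothness of L, and tends to L as M -> infinity.
    return M * (1.0 - np.exp(-raw_loss / M))

def nag_fit(X, y, M=1.0, C=1.0, lr=0.005, momentum=0.9, iters=3000):
    """Fit f(x) = w.x + b by minimizing 0.5*||w||^2 + C * sum_i B(r_i^2)
    with Nesterov accelerated gradient (NAG); a sketch, not the paper's code."""
    w = np.zeros(X.shape[1])
    b = 0.0
    vw = np.zeros_like(w)
    vb = 0.0
    for _ in range(iters):
        # Nesterov look-ahead point: evaluate the gradient after the
        # momentum step, not at the current iterate.
        w_la = w + momentum * vw
        b_la = b + momentum * vb
        r = X @ w_la + b_la - y
        # dB(r^2)/dr = 2*r*exp(-r^2 / M): large residuals (outliers)
        # are exponentially down-weighted, which is the robustness mechanism.
        g = 2.0 * r * np.exp(-(r ** 2) / M)
        gw = w_la + C * (X.T @ g)   # regularizer gradient + data term
        gb = C * g.sum()
        vw = momentum * vw - lr * gw
        vb = momentum * vb - lr * gb
        w = w + vw
        b = b + vb
    return w, b
```

With a small `M`, a gross outlier contributes an almost-zero gradient (its residual sits far in the flat tail of `B`), so the fitted line tracks the clean points; with `M` large the loop reduces to ordinary regularized least squares.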
Pages: 1325-1339
Page count: 15