Robust regression under the general framework of bounded loss functions

Cited by: 13
Authors
Fu, Saiji [1 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Tang, Long [6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing 100876, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] MOE Social Sci Lab Digital Econ Forecasts & Policy, Beijing 100190, Peoples R China
[6] Nanjing Univ Informat Sci & Technol, Sch Artificial Intelligence, Nanjing 210044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Robustness and sensitivity analysis; Bounded loss function; Regression; Least squares loss function; Support vector regression; SUPPORT VECTOR REGRESSION; MACHINE; SPARSE; BLINEX;
DOI
10.1016/j.ejor.2023.04.025
Chinese Library Classification
C93 [Management Science];
Discipline Classification Codes
12 ; 1201 ; 1202 ; 120202 ;
Abstract
Conventional regression methods often fail in the presence of noise. Applying a bounded loss function is an effective way to enhance regressor robustness. However, most bounded loss functions exist in Ramp-style forms and lose some of the inherent properties of the original function due to hard truncation. Moreover, there is currently no unified framework for designing bounded loss functions. In response to these two issues, this paper proposes a general framework that can smoothly and adaptively bound any non-negative function. The bounded loss can not only degenerate to the original function but also inherit its elegant properties, including symmetry, differentiability, and smoothness. Under this framework, a robust regressor called bounded least squares support vector regression (BLSSVR) is proposed to mitigate the effects of noise and outliers by limiting the maximum loss. With appropriate parameters, the bounded least squares loss grows faster than its unbounded form in the initial stage, which helps BLSSVR assign larger weights to non-outlier points. The Nesterov accelerated gradient (NAG) algorithm is employed to optimize BLSSVR. Extensive experiments on synthetic and real-world datasets demonstrate the superiority of BLSSVR over benchmark methods. © 2023 Elsevier B.V. All rights reserved.
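The abstract does not give the bounding transform or the BLSSVR objective in closed form. As a rough illustration only, the sketch below bounds the squared loss with a hypothetical exponential-type transform l_b(r) = λ(1 − exp(−r²/λ)), which is bounded above by λ and reduces to r² as λ → ∞, and minimises a ridge-regularised sum of these losses with a standard Nesterov accelerated gradient loop. The function names, the parameter λ, and the constant C are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def bounded_least_squares_loss(residual, lam):
    """Hypothetical exponential-type smooth bound on the squared loss.

    l_b(r) = lam * (1 - exp(-r**2 / lam)): capped at lam, recovers r**2
    as lam -> infinity.  Illustrative only, not the paper's construction.
    """
    return lam * (1.0 - np.exp(-residual ** 2 / lam))

def bounded_loss_grad(residual, lam):
    """Derivative of l_b with respect to the residual: 2 r exp(-r**2 / lam)."""
    return 2.0 * residual * np.exp(-residual ** 2 / lam)

def fit_blssvr_sketch(X, y, lam=1.0, C=1.0, lr=1e-2, momentum=0.9, n_iter=500):
    """Minimise 0.5*||w||^2 + C * sum_i l_b(x_i @ w + b - y_i) with NAG.

    Sketch of a bounded-loss linear regressor; parameter names are assumptions.
    """
    n, d = X.shape
    theta = np.zeros(d + 1)             # stacked [w, b]
    velocity = np.zeros_like(theta)
    for _ in range(n_iter):
        look = theta + momentum * velocity          # Nesterov look-ahead point
        w, b = look[:d], look[d]
        residual = X @ w + b - y
        g_r = bounded_loss_grad(residual, lam)      # dl_b/dr at the look-ahead
        grad_w = w + C * (X.T @ g_r)                # regulariser + data term
        grad_b = C * g_r.sum()
        grad = np.concatenate([grad_w, [grad_b]])
        velocity = momentum * velocity - lr * grad  # NAG velocity update
        theta = theta + velocity
    return theta[:d], theta[d]
```

Calling fit_blssvr_sketch(X, y) returns linear coefficients whose per-point contribution to the loss is capped by λ, which mirrors the qualitative behaviour the abstract attributes to BLSSVR: points with very large residuals cannot dominate the fit.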
Pages: 1325 - 1339
Page count: 15
Related Papers
50 records
  • [21] A robust algorithm of support vector regression with a trimmed Huber loss function in the primal
    Chen, Chuanfa
    Yan, Changqing
    Zhao, Na
    Guo, Bin
    Liu, Guolin
    SOFT COMPUTING, 2017, 21 : 5235 - 5243
  • [22] A General and Robust Framework for Secondary Traits Analysis
    Song, Xiaoyu
    Ionita-Laza, Iuliana
    Liu, Mengling
    Reibman, Joan
    Wei, Ying
    GENETICS, 2016, 202 (04) : 1329 - 1343
  • [23] Training robust support vector regression with smooth non-convex loss function
    Zhong, Ping
    OPTIMIZATION METHODS & SOFTWARE, 2012, 27 (06) : 1039 - 1058
  • [24] Robust Support Vector Regression in Primal with Asymmetric Huber Loss
    Balasundaram, S.
    Meena, Yogendra
    NEURAL PROCESSING LETTERS, 2019, 49 (03) : 1399 - 1431
  • [25] Combination of loss functions for robust breast cancer prediction
    Hajiabadi, Hamideh
    Babaiyan, Vahide
    Zabihzadeh, Davood
    Hajiabadi, Moein
    COMPUTERS & ELECTRICAL ENGINEERING, 2020, 84
  • [26] Robust Extreme Learning Machines with Different Loss Functions
    Ren, Zhuo
    Yang, Liming
    NEURAL PROCESSING LETTERS, 2019, 49 (03) : 1543 - 1565
  • [27] Robust Support Vector Regression with Generalized Loss Function and Applications
    Wang, Kuaini
    Zhu, Wenxin
    Zhong, Ping
    NEURAL PROCESSING LETTERS, 2015, 41 (01) : 89 - 106
  • [28] Forecasting peak electric load: Robust support vector regression with smooth nonconvex ϵ-insensitive loss
    Nie, Rujia
    Che, Jinxing
    Yuan, Fang
    Zhao, Weihua
    JOURNAL OF FORECASTING, 2024, 43 (06) : 1902 - 1917
  • [29] When fractional calculus meets robust learning: Adaptive robust loss functions
    Kurucu, Mert Can
    Guzelkaya, Mujde
    Eksin, Ibrahim
    Kumbasar, Tufan
    KNOWLEDGE-BASED SYSTEMS, 2025, 312
  • [30] Robust inference on correlation under general heterogeneity
    Giraitis, Liudas
    Li, Yufei
    Phillips, Peter C. B.
    JOURNAL OF ECONOMETRICS, 2024, 240 (01)