Robust regression under the general framework of bounded loss functions

Cited: 13
Authors
Fu, Saiji [1 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Tang, Long [6 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing 100876, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] MOE Social Sci Lab Digital Econ Forecasts & Policy, Beijing 100190, Peoples R China
[6] Nanjing Univ Informat Sci & Technol, Sch Artificial Intelligence, Nanjing 210044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Robustness and sensitivity analysis; Bounded loss function; Regression; Least squares loss function; Support vector regression; SUPPORT VECTOR REGRESSION; MACHINE; SPARSE; BLINEX;
DOI
10.1016/j.ejor.2023.04.025
CLC classification
C93 [Management Science];
Discipline codes
12; 1201; 1202; 120202;
Abstract
Conventional regression methods often fail when encountering noise. The application of a bounded loss function is an effective means to enhance regressor robustness. However, most bounded loss functions exist in Ramp-style forms, losing some inherent properties of the original function due to hard truncation. Besides, there is currently no unified framework for designing bounded loss functions. In response to these two issues, this paper proposes a general framework that can smoothly and adaptively bound any non-negative function. It can not only degenerate to the original function, but also inherit its elegant properties, including symmetry, differentiability and smoothness. Under this framework, a robust regressor called bounded least squares support vector regression (BLSSVR) is proposed to mitigate the effects of noise and outliers by limiting the maximum loss. With appropriate parameters, the bounded least squares loss grows faster than its unbounded form in the initial stage, which helps BLSSVR assign larger weights to non-outlier points. Meanwhile, the Nesterov accelerated gradient (NAG) algorithm is employed to optimize BLSSVR. Extensive experiments on synthetic and real-world datasets demonstrate the superiority of BLSSVR over benchmark methods. © 2023 Elsevier B.V. All rights reserved.
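The abstract describes smoothly bounding an arbitrary non-negative loss so that the bounded form stays differentiable and degenerates to the original loss. The paper's exact construction is not reproduced in this record; as a hedged illustration only, one common smooth bounding transform is g(L) = λ(1 − e^(−L/λ)), which is capped at λ, is smooth, and recovers L as λ → ∞. The sketch below applies it to the least-squares loss and fits a linear model with Nesterov accelerated gradient; the function names, the transform, and all parameter values are illustrative assumptions, not the authors' BLSSVR.

```python
import numpy as np

# Illustrative smooth bound (NOT the paper's formulation):
# g(L) = lam * (1 - exp(-L / lam)); g(L) <= lam, and g(L) -> L as lam -> inf.
def bounded(loss, lam):
    """Smoothly cap a non-negative loss value at lam."""
    return lam * (1.0 - np.exp(-loss / lam))

def bounded_ls_grad(w, X, y, lam):
    """Gradient of sum_i g(r_i^2) for the bounded least-squares loss.
    Since g'(L) = exp(-L/lam), large residuals get exponentially small weight,
    which suppresses the influence of outliers."""
    r = X @ w - y
    return X.T @ (2.0 * r * np.exp(-r**2 / lam))

def nag_fit(X, y, lam=4.0, lr=1e-3, mu=0.9, iters=1000):
    """Fit linear weights by Nesterov accelerated gradient (sketch)."""
    w = np.zeros(X.shape[1])
    v = np.zeros_like(w)
    for _ in range(iters):
        g = bounded_ls_grad(w + mu * v, X, y, lam)  # look-ahead gradient
        v = mu * v - lr * g
        w = w + v
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.5, -2.0]) + 0.01 * rng.normal(size=200)
y[:5] += 50.0  # inject gross outliers
w = nag_fit(X, y)
print(np.round(w, 2))  # close to the true [1.5, -2.0] despite the outliers
```

Because the outliers' residuals are huge, their gradient weight exp(−r²/λ) is effectively zero, so the fit is driven by the clean points; an unbounded squared loss would instead be dominated by the five corrupted targets.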
Pages: 1325-1339 (15 pages)
Related Papers (50 total)
  • [41] Robust Twin Support Vector Regression with Smooth Truncated Hε Loss Function
    Shi, Ting
    Chen, Sugen
    NEURAL PROCESSING LETTERS, 2023, 55 (07) : 9179 - 9223
  • [42] A robust support vector regression with a linear-log concave loss function
    Kim, Dohyun
    Lee, Chungmok
    Hwang, Sangheum
    Jeong, Myong K.
    JOURNAL OF THE OPERATIONAL RESEARCH SOCIETY, 2016, 67 (05) : 735 - 742
  • [43] Distributed robust support vector ordinal regression under label noise
    Liu, Huan
    Tu, Jiankai
    Gao, Anqi
    Li, Chunguang
    NEUROCOMPUTING, 2024, 598
  • [44] Robust learning of Huber loss under weak conditional moment
    Huang, Shouyou
    NEUROCOMPUTING, 2022, 507 : 191 - 198
  • [45] An Optimal Control Framework for Online Job Scheduling with General Cost Functions
    Etesami, S. Rasoul
    OPERATIONS RESEARCH, 2022, : 2674 - 2701
  • [46] A robust outlier control framework for classification designed with family of homotopy loss function
    Wang, Yidan
    Yang, Liming
    Yuan, Chao
    NEURAL NETWORKS, 2019, 112 : 41 - 53
  • [47] Bayesian estimation of ridge parameter under different loss functions
    Amin, Muhammad
    Akram, Muhammad Nauman
    Ramzan, Qasim
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2022, 51 (12) : 4055 - 4071
  • [48] A NOVEL GENERAL SEMISUPERVISED DEEP LEARNING FRAMEWORK FOR CLASSIFICATION AND REGRESSION WITH REMOTE SENSING IMAGES
    Chen, Zhao
    Chen, Guangchen
    Zhou, Feng
    Yang, Bin
    Wang, Lili
    Liu, Qiong
    Chen, Yonghang
    IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2020, : 1323 - 1326
  • [49] The construction of the robust regression models with the LS-SVM method using a nonquadratic loss function
    Popov, Alexander A.
    Boboev, Sharaf A.
    2016 11TH INTERNATIONAL FORUM ON STRATEGIC TECHNOLOGY (IFOST), PTS 1 AND 2, 2016,
  • [50] An overview of asymptotic properties of Lp regression under general classes of error distributions
    Lai, RY
    Lee, SMS
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2005, 100 (470) : 446 - 458