Robust regression framework with asymmetrically analogous to correntropy-induced loss

Cited: 7
Authors
Yang, Liming [1 ]
Ding, Guangsheng [1 ]
Yuan, Chao [2 ]
Zhang, Min [1 ]
Affiliations
[1] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
[2] China Agr Univ, Coll Informat & Elect Engn, Beijing, Peoples R China
Keywords
Robustness; Asymmetric least squares loss; Expectile; Nonconvexity; Correntropy; Regression; CCCP; Support vector machine; Classification; Quantiles
DOI
10.1016/j.knosys.2019.105211
CLC classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405
Abstract
This work proposes a robust loss function based on the expectile penalty (named the rescaled expectile loss, RE-loss), which includes and generalizes several existing loss functions. Important properties of the RE-loss are then established, including asymmetry, nonconvexity, smoothness, boundedness, and its asymptotic approximation behavior. From the viewpoint of correntropy, we show that the proposed RE-loss can be viewed as a correntropy-induced loss generated by a reproducing piecewise kernel. Furthermore, a sparse version of the RE-loss (the SRE-loss) is developed to improve sparsity by introducing an epsilon-insensitive zone. Two robust regression frameworks are then built on the proposed loss functions. Because the nonconvexity of the losses makes the resulting problems difficult to optimize, we apply the concave-convex procedure (CCCP) and duality theory to solve them effectively; the resulting algorithms converge linearly. To validate the proposed methods, we carry out numerical experiments on datasets of different scales with different levels of noise and outliers. Across three kinds of data (an artificial database, benchmark databases, and a practical application database), the experimental results demonstrate that the proposed methods achieve better generalization than traditional regression methods in most cases, especially when the noise and outlier distributions are imbalanced. (C) 2019 Elsevier B.V. All rights reserved.
Pages: 13
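
The record reproduces only the abstract, not the loss definitions, but the ingredients it names are standard. Below is a minimal numerical sketch, assuming the usual expectile (asymmetric least squares) penalty and a correntropy-style rescaling 1 - exp(-loss/sigma^2) analogous to other correntropy-induced losses; the function names and the parameters tau, sigma, and eps are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def expectile_loss(r, tau=0.7):
    # Asymmetric least squares (expectile) penalty: positive residuals
    # are weighted by tau, negative ones by (1 - tau).
    w = np.where(r >= 0, tau, 1.0 - tau)
    return w * r**2

def re_loss(r, tau=0.7, sigma=1.0):
    # Hypothetical "rescaled expectile" loss: a correntropy-style
    # exponential rescaling of the expectile penalty. It is bounded in
    # [0, 1), nonconvex, and behaves like the expectile loss (up to a
    # constant factor) for small residuals.
    return 1.0 - np.exp(-expectile_loss(r, tau) / sigma**2)

def sre_loss(r, tau=0.7, sigma=1.0, eps=0.1):
    # Sparse variant: an epsilon-insensitive zone zeroes out small
    # residuals before the rescaled penalty is applied.
    r_shrunk = np.sign(r) * np.maximum(np.abs(r) - eps, 0.0)
    return re_loss(r_shrunk, tau, sigma)

residuals = np.linspace(-3.0, 3.0, 7)
print(re_loss(residuals))   # gross outliers saturate near 1 (bounded influence)
print(sre_loss(residuals))  # exactly zero inside the eps-insensitive band
```

The boundedness of the exponential rescaling is what caps the influence of gross outliers, at the price of nonconvexity; CCCP handles this by writing the objective as a convex part plus a concave part and solving a convex surrogate at each iteration.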