Robust regression framework with asymmetrically analogous to correntropy-induced loss

Cited: 7
Authors
Yang, Liming [1]
Ding, Guangsheng [1]
Yuan, Chao [2]
Zhang, Min [1]
Affiliations
[1] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
[2] China Agr Univ, Coll Informat & Elect Engn, Beijing, Peoples R China
Keywords
Robustness; Asymmetry least square loss; Expectile; Nonconvexity; Correntropy; Regression; CCCP; SUPPORT VECTOR MACHINE; CLASSIFICATION; QUANTILES;
DOI
10.1016/j.knosys.2019.105211
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This work proposes a robust loss function based on the expectile penalty, termed the rescaled expectile loss (RE-loss), which includes and generalizes several existing loss functions. Important properties of the RE-loss are then established, including asymmetry, nonconvexity, smoothness, boundedness, and its asymptotic approximation behavior. From the viewpoint of correntropy, we show that the proposed RE-loss can be interpreted as a correntropy-induced loss with a reproducing piecewise kernel. Furthermore, a sparse version of the RE-loss (the SRE-loss) is developed to improve sparsity by introducing an epsilon-insensitive zone. Two robust regression frameworks are then built on the proposed loss functions. However, the nonconvexity of the proposed losses makes the resulting problems difficult to optimize. We apply the concave-convex procedure (CCCP) and duality theory to solve them effectively, and the resulting algorithms converge linearly. To validate the proposed methods, we carry out numerical experiments on datasets of different scales under different levels of noise and outliers. On three collections of data, namely an artificial database, a benchmark database, and a practical application database, the experimental results show that the proposed methods achieve better generalization than traditional regression methods in most cases, especially when the noise and outlier distributions are imbalanced. (C) 2019 Elsevier B.V. All rights reserved.
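The construction described in the abstract can be illustrated with a minimal sketch. The expectile (asymmetric least-squares) penalty below is the standard one; the bounded rescaling uses a generic correntropy-style transform, 1 - exp(-loss / sigma^2), as an illustrative stand-in for the paper's RE-loss, whose exact definition (and the epsilon-insensitive SRE variant) is not reproduced here. The parameter names `tau` and `sigma` are assumptions for this sketch.

```python
import numpy as np

def expectile_loss(r, tau=0.5):
    """Asymmetric least-squares (expectile) penalty: squared residuals
    weighted by tau when r >= 0 and by (1 - tau) when r < 0, so the
    penalty is asymmetric whenever tau != 0.5."""
    r = np.asarray(r, dtype=float)
    w = np.where(r >= 0, tau, 1.0 - tau)
    return w * r**2

def rescaled_expectile_loss(r, tau=0.5, sigma=1.0):
    """Illustrative bounded rescaling of the expectile penalty via a
    correntropy-style transform: 1 - exp(-loss / sigma^2). It stays in
    [0, 1) in exact arithmetic and saturates for large residuals, which
    is what makes correntropy-induced losses robust to outliers."""
    return 1.0 - np.exp(-expectile_loss(r, tau) / sigma**2)
```

Note how the rescaled loss inherits asymmetry from the expectile penalty while adding boundedness and nonconvexity, matching the properties the abstract lists for the RE-loss.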
Pages: 13
Related papers
50 records in total
  • [1] Efficient and robust deep learning with Correntropy-induced loss function
    Chen, Liangjun
    Qu, Hua
    Zhao, Jihong
    Chen, Badong
    Principe, Jose C.
    NEURAL COMPUTING & APPLICATIONS, 2016, 27 (04) : 1019 - 1031
  • [2] Correntropy-Induced Robust Low-Rank Hypergraph
    Jin, Taisong
    Ji, Rongrong
    Gao, Yue
    Sun, Xiaoshuai
    Zhao, Xibin
    Tao, Dacheng
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (06) : 2755 - 2769
  • [3] Sparse Additive Machine With the Correntropy-Induced Loss
    Yuan, Peipei
    You, Xinge
    Chen, Hong
    Wang, Yingjie
    Peng, Qinmu
    Zou, Bin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, : 1989 - 2003
  • [4] A robust semi-supervised regressor with correntropy-induced manifold regularization and adaptive graph
    Zheng, Xiaohan
    Zhang, Li
    Yan, Leilei
    Zhao, Lei
    NEURAL NETWORKS, 2025, 182
  • [5] Deep particulate matter forecasting model using correntropy-induced loss
    Kim, Jongsu
    Lee, Changhoon
    JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY, 2021, 35 (09) : 4045 - 4063
  • [6] CTSVM: A robust twin support vector machine with correntropy-induced loss function for binary classification problems
    Zheng, Xiaohan
    Zhang, Li
    Yan, Leilei
    INFORMATION SCIENCES, 2021, 559 : 22 - 45
  • [7] Robust least squares twin bounded support vector machine with a generalized correntropy-induced metric
    Yuan, Chao
    Zhou, Changsheng
    Pan, Honghao
    Arvin, Farshad
    Peng, Jigen
    Li, Haiyang
    INFORMATION SCIENCES, 2025, 699
  • [8] A Robust Regression Framework with Laplace Kernel-Induced Loss
    Yang, Liming
    Ren, Zhuo
    Wang, Yidan
    Dong, Hongwei
    NEURAL COMPUTATION, 2017, 29 (11) : 3014 - 3039