Learning with the Maximum Correntropy Criterion Induced Losses for Regression

Cited by: 0
Authors
Feng, Yunlong [1 ]
Huang, Xiaolin [1 ]
Shi, Lei [2 ]
Yang, Yuning [1 ]
Suykens, Johan A. K. [1 ]
Affiliations
[1] Katholieke Univ Leuven, ESAT STADIUS, Dept Elect Engn, Kasteelpk Arenberg 10, B-3001 Leuven, Belgium
[2] Fudan Univ, Sch Math Sci, Shanghai Key Lab Contemporary Appl Math, Shanghai 200433, Peoples R China
Funding
National Natural Science Foundation of China; European Research Council;
Keywords
correntropy; the maximum correntropy criterion; robust regression; robust loss function; least squares regression; statistical learning theory; SUPPORT VECTOR MACHINES; KERNEL-BASED REGRESSION; LINEAR LEAST-SQUARES; ROBUSTNESS; RATES; SELECTION; ERROR;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Within the statistical learning framework, this paper studies the regression model associated with correntropy-induced losses. Correntropy, as a similarity measure, has been widely employed in signal processing and pattern recognition. Motivated by its empirical successes, this paper aims to provide some theoretical understanding of the maximum correntropy criterion in regression problems. Our focus is twofold: first, we examine the connections between the regression model associated with the correntropy-induced loss and the least squares regression model; second, we study its convergence properties. A learning theory analysis centered on these two aspects is conducted. The analysis shows that the scale parameter in the loss function trades off the convergence rate of the regression model against its robustness. We then sketch a general view of robust loss functions as applied to regression learning problems. Numerical experiments are also carried out to verify the effectiveness of the model.
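For context, the correntropy-induced loss referred to in the abstract is commonly written in the following Welsch-type form; the exact normalization below is an assumption and may differ from the paper's own notation:

\[ \ell_\sigma\bigl(y, f(x)\bigr) \;=\; \sigma^2 \Bigl( 1 - \exp\Bigl( -\tfrac{(y - f(x))^2}{\sigma^2} \Bigr) \Bigr), \qquad \sigma > 0. \]

Here \(\sigma\) is the scale parameter mentioned above: as \(\sigma \to \infty\) the loss approaches the squared loss \((y - f(x))^2\), recovering least squares regression, while for small \(\sigma\) the loss is bounded by \(\sigma^2\), so large residuals have limited influence. This is one way to see the trade-off between convergence rate and robustness described in the abstract.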
Pages: 993 - 1034
Number of pages: 42
Related Papers
50 records in total
  • [41] Modified State Estimation with Fixed Point Update Based on Maximum Correntropy Criterion
    Maki, Hayato
    Katsura, Seiichiro
    2019 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT), 2019, : 744 - 749
  • [42] Error analysis for the semi-supervised algorithm under maximum correntropy criterion
    Zuo, Ling
    Wang, Yulong
    NEUROCOMPUTING, 2017, 223 : 45 - 53
  • [43] Training extreme learning machine via regularized correntropy criterion
    Xing, Hong-Jie
    Wang, Xin-Mei
    NEURAL COMPUTING & APPLICATIONS, 2013, 23 (7-8) : 1977 - 1986
  • [44] A Robust and Adaptive AUV Integrated Navigation Algorithm Based on a Maximum Correntropy Criterion
    Li, Pinchi
    Sun, Xiaona
    Chen, Ziyun
    Zhang, Xiaolin
    Yan, Tianhong
    He, Bo
    ELECTRONICS, 2024, 13 (13)
  • [45] Steady-State Tracking Analysis of Adaptive Filter With Maximum Correntropy Criterion
    Khalili, Azam
    Rastegarnia, Amir
    Islam, Md Kafiul
    Rezaii, Tohid Yousefi
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2017, 36 (04) : 1725 - 1734
  • [46] Robust and stable gene selection via Maximum-Minimum Correntropy Criterion
    Mohammadi, Majid
    Noghabi, Hossein Sharifi
    Hodtani, Ghosheh Abed
    Mashhadi, Habib Rajabi
    GENOMICS, 2016, 107 (2-3) : 83 - 87
  • [47] A Variable Step-Size Adaptive Algorithm under Maximum Correntropy Criterion
    Wang, Ren
    Chen, Badong
    Zheng, Nanning
    Principe, Jose C.
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [48] Training extreme learning machine via regularized correntropy criterion
    Xing, Hong-Jie
    Wang, Xin-Mei
    NEURAL COMPUTING & APPLICATIONS, 2013, 23 (7-8) : 1977 - 1986
  • [49] Robust regression framework with asymmetrically analogous to correntropy-induced loss
    Yang, Liming
    Ding, Guangsheng
    Yuan, Chao
    Zhang, Min
    KNOWLEDGE-BASED SYSTEMS, 2020, 191
  • [50] Robust Matrix Completion via Maximum Correntropy Criterion and Half-Quadratic Optimization
    He, Yicong
    Wang, Fei
    Li, Yingsong
    Qin, Jing
    Chen, Badong
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 181 - 195