A working likelihood approach to support vector regression with a data-driven insensitivity parameter

Cited by: 0
Authors
Jinran Wu
You-Gan Wang
Affiliations
[1] Queensland University of Technology
[2] Australian Catholic University
Source
International Journal of Machine Learning and Cybernetics | 2023, Vol. 14
Keywords
Approximate loss function; Parameter estimation; Prediction; Working likelihood
DOI
Not available
Abstract
The insensitivity parameter in support vector regression determines the set of support vectors and therefore greatly affects the prediction. A data-driven approach is proposed to determine an approximate value for this insensitivity parameter by minimizing a generalized loss function derived from the likelihood principle. The proposed data-driven support vector regression also statistically standardizes the samples using the scale of the noise, in contrast to the conventional response-scaling method. Statistical standardization, together with probabilistic regularization based on a working likelihood function, produces data-dependent values for the hyperparameters, including the insensitivity parameter. Exact asymptotic solutions are provided when the noise is normally distributed. Linear and nonlinear numerical simulations with three types of noise (ϵ-Laplacian, normal, and uniform distributions), together with five real benchmark data sets, are used to test the capacity of the proposed method. Across all simulations and the five case studies, the proposed support vector regression with a working-likelihood, data-driven insensitivity parameter is superior and has lower computational cost.
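To make the idea concrete, the sketch below illustrates one way a data-driven insensitivity parameter can be obtained in practice: a pilot fit yields residuals, a robust scale estimate stands in for the noise scale, and epsilon is set proportional to that scale. The pilot fit, the MAD estimator, and the 0.612 proportionality constant (a Gaussian-noise heuristic from the SVR literature) are illustrative assumptions, not the paper's working-likelihood estimator, which derives its own data-dependent hyperparameter values.

```python
# Minimal sketch (not the paper's exact algorithm): choose the SVR
# insensitivity parameter epsilon from a data-driven estimate of the
# noise scale, assuming approximately Gaussian noise.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic nonlinear example with Gaussian noise (illustration only).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel() + rng.normal(scale=0.2, size=200)

# Pilot fit to obtain residuals, then a robust noise-scale estimate via
# the median absolute deviation (MAD); this pilot step is an assumption,
# not the working-likelihood procedure of the paper.
pilot = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)
resid = y - pilot.predict(X)
sigma_hat = 1.4826 * np.median(np.abs(resid - np.median(resid)))

# For Gaussian noise, an asymptotically motivated choice sets epsilon
# roughly proportional to the noise scale (constant ~0.612).
epsilon_hat = 0.612 * sigma_hat

model = SVR(kernel="rbf", C=1.0, epsilon=epsilon_hat).fit(X, y)
print(f"estimated noise scale: {sigma_hat:.3f}, epsilon: {epsilon_hat:.3f}")
```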
Pages: 929-945
Number of pages: 16