A regularization on Lagrangian twin support vector regression

Cited by: 35
Authors
Tanveer, M. [1 ]
Shubham, K. [2 ]
Affiliations
[1] LNM Inst Informat Technol, Dept Comp Sci & Engn, Jaipur 302031, Rajasthan, India
[2] LNM Inst Informat Technol, Dept Elect & Commun Engn, Jaipur 302031, Rajasthan, India
Keywords
Machine learning; Lagrangian support vector machines; Twin support vector regression; Iterative method; MACHINE; CLASSIFICATION;
DOI
10.1007/s13042-015-0361-6
CLC classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Twin support vector regression (TSVR), Lagrangian TSVR (LTSVR) and ε-TSVR achieve good generalization and faster computation than support vector regression (SVR) by solving a pair of smaller quadratic programming problems (QPPs) rather than a single large QPP. In this paper, a simple and linearly convergent Lagrangian support vector machine algorithm for the dual of ε-TSVR is proposed. The contributions of our formulation are as follows: (1) we consider the square of the 2-norm of the vector of slack variables instead of the usual 1-norm, which makes the objective functions strongly convex. (2) We solve the regression problem with just two systems of linear equations, as opposed to two QPPs in ε-TSVR and TSVR or one large QPP in SVR, which leads to an extremely simple and fast algorithm. (3) A significant advantage of our proposed method is the implementation of the structural risk minimization principle; in contrast, the primal problems of TSVR and LTSVR consider only empirical risk, owing to their complex structure, and may therefore overfit and yield suboptimal solutions in some cases. (4) Experimental results on several artificial and benchmark datasets show the effectiveness of the proposed formulation.
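The abstract's central computational claim, that each of the two bound regressors reduces to a regularized linear solve rather than a QPP, can be illustrated with a deliberately simplified sketch. The snippet below is not the paper's Lagrangian iterative algorithm: it is a least-squares-style twin regressor with ridge (squared 2-norm) regularization, and all names and parameters (`twin_svr_fit`, `eps1`, `c1`, etc.) are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def twin_svr_fit(X, y, eps1=0.1, eps2=0.1, c1=1.0, c2=1.0):
    """Fit a simplified twin SVR: the down-bound f1(x) = w1·x + b1 and
    up-bound f2(x) = w2·x + b2 each come from one regularized linear
    system (the 2-norm regularizer keeps each system strongly convex,
    hence uniquely solvable)."""
    m = X.shape[0]
    G = np.hstack([X, np.ones((m, 1))])        # augment with a bias column
    I = np.eye(G.shape[1])
    # down-bound regressor: targets shifted down by eps1
    u1 = np.linalg.solve(G.T @ G + (1.0 / c1) * I, G.T @ (y - eps1))
    # up-bound regressor: targets shifted up by eps2
    u2 = np.linalg.solve(G.T @ G + (1.0 / c2) * I, G.T @ (y + eps2))
    return u1, u2

def twin_svr_predict(X, u1, u2):
    """Final regressor is the mean of the two bound functions."""
    G = np.hstack([X, np.ones((X.shape[0], 1))])
    return 0.5 * (G @ u1 + G @ u2)
```

Because both solves share the same Gram-style matrix and differ only in the right-hand side, the whole fit costs two small linear systems, which is the source of the speed advantage the abstract describes over QPP-based SVR.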
Pages: 807 - 821 (15 pages)
Related papers
50 records
  • [31] Modified twin support vector regression
    Parastalooi, Nafiseh
    Amiri, Ali
    Aliheidari, Parisa
    NEUROCOMPUTING, 2016, 211 : 84 - 97
  • [32] Smooth twin support vector regression
    Chen, Xiaobo
    Yang, Jian
    Liang, Jun
    Ye, Qiaolin
    NEURAL COMPUTING & APPLICATIONS, 2012, 21 (03): : 505 - 513
  • [33] On a new approach for Lagrangian support vector regression
    Balasundaram, S.
    Benipal, Gagandeep
    NEURAL COMPUTING & APPLICATIONS, 2018, 29 (09): : 533 - 551
  • [35] Estimation of the regularization parameter for support vector regression
    Jordaan, EM
    Smits, GF
    PROCEEDING OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 2192 - 2197
  • [36] Lagrangian twin support vector regression and genetic algorithm based robust grayscale image watermarking
    Yadav, Ashok Kumar
    Mehta, Rajesh
    Kumar, Raj
    Vishwakarma, Virendra P.
    MULTIMEDIA TOOLS AND APPLICATIONS, 2016, 75 (15): : 9371 - 9394
  • [38] A New Fast Twin Support Vector Regression
    Fan, Ying
    Shi, Yilin
    Kang, Kai
    Zheng, Fengde
    Su, Peng
    Yang, Jie
    PROCEEDINGS OF 2020 IEEE 11TH INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING AND SERVICE SCIENCE (ICSESS 2020), 2020, : 489 - 492
  • [39] Twin support vector regression with Huber loss
    Niu, Jiayi
    Chen, Jing
    Xu, Yitian
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2017, 32 (06) : 4247 - 4258
  • [40] Accurate incremental ε-twin support vector regression
    Cao J.
    Gu B.-J.
    Pan F.
    Xiong W.-L.
    Kongzhi Lilun Yu Yingyong/Control Theory and Applications, 2022, 39 (06): : 1020 - 1032