A sparse method for least squares twin support vector regression

Cited by: 15
Authors
Huang, Huajuan [1 ]
Wei, Xiuxi [2 ]
Zhou, Yongquan [1 ]
Affiliations
[1] Guangxi Univ Nationalities, Coll Informat Sci & Engn, Nanning 530006, Peoples R China
[2] Guangxi Int Business Vocat Coll, Dept Informat Engn, Nanning 530007, Peoples R China
Keywords
TSVR; LSTSVR; Sparse solution; Input features; Linear programming
DOI
10.1016/j.neucom.2015.12.133
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recently, nonparallel plane regressors such as twin support vector regression (TSVR) and least squares TSVR (LSTSVR) have been proposed and have attracted much attention. However, these algorithms are not sparse, which slows their learning. In this paper, we propose a novel nonparallel plane regressor that can automatically select the relevant features. First, we introduce a regularization term into the objective function of LSTSVR, which guarantees that the two quadratic programming problems (QPPs) are strongly convex, so the proposed algorithm obtains a globally unique solution. Second, the primal formulation is converted into a linear programming (LP) problem. We then solve the dual of the LP formulation by minimizing its exterior penalty problem, which makes our method yield very sparse solutions. In other words, the method can suppress input features, so it achieves comparable regression performance in less computational time. Numerical experiments on an artificial dataset and benchmark datasets demonstrate the feasibility and validity of the proposed algorithm. (C) 2016 Elsevier B.V. All rights reserved.
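The sparsity mechanism the abstract describes — a 1-norm term solved through a linear program driving many coefficients to exactly zero — can be illustrated with a minimal sketch. This is not the paper's LSTSVR formulation; it is a generic 1-norm-penalized L1-loss regression written as an LP (the penalty weight `C`, the data sizes, and the split-variable encoding are all assumptions for the demo):

```python
# Sketch (not the paper's exact formulation): an L1 (1-norm) penalty,
# solved as a linear program, yields sparse regression weights.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 50, 6
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.0, 0.0, 0.0, 0.0])  # only 2 relevant features
y = X @ w_true                       # noiseless target for a clean demo

C = 0.1                              # weight of the sparsity-inducing 1-norm term
# Split-variable LP: w = p - q, residual r = u - v, all parts >= 0.
#   minimize  C*1'(p+q) + 1'(u+v)   s.t.   X(p-q) + (u-v) = y
c = np.concatenate([C * np.ones(2 * d), np.ones(2 * n)])
A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
res = linprog(c, A_eq=A_eq, b_eq=y, method="highs")
w = res.x[:d] - res.x[d:2 * d]
print(np.round(w, 3))                # weights of irrelevant features are driven to zero
```

Because LP solvers return vertex solutions, the irrelevant coefficients come out exactly (not just approximately) zero, which is the sense in which the LP reformulation "suppresses input features".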
Pages: 150-158
Page count: 9
References
26 items in total
[1]  
[Anonymous], AISTATS
[2]   Distributed customer behavior prediction using multiplex data: A collaborative MK-SVM approach [J].
Chen, Zhen-Yu ;
Fan, Zhi-Ping .
KNOWLEDGE-BASED SYSTEMS, 2012, 35 :111-119
[3]  
CORTES C, 1995, MACH LEARN, V20, P273, DOI 10.1023/A:1022627411411
[4]   Research on the hybrid models of granular computing and support vector machine [J].
Ding, Shifei ;
Huang, Huajuan ;
Yu, Junzhao ;
Zhao, Han .
ARTIFICIAL INTELLIGENCE REVIEW, 2015, 43 (04) :565-577
[5]   1-Norm least squares twin support vector machines [J].
Gao, Shangbing ;
Ye, Qiaolin ;
Ye, Ning .
NEUROCOMPUTING, 2011, 74 (17) :3590-3597
[6]   Primal least squares twin support vector regression [J].
Huang, Hua-juan ;
Ding, Shi-fei ;
Shi, Zhong-zhi .
JOURNAL OF ZHEJIANG UNIVERSITY-SCIENCE C-COMPUTERS & ELECTRONICS, 2013, 14 (09) :722-732
[7]   Twin support vector machines for pattern classification [J].
Jayadeva ;
Khemchandani, R. ;
Chandra, Suresh .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2007, 29 (05) :905-910
[8]   An Optimal PID Control Algorithm for Training Feedforward Neural Networks [J].
Jing, Xingjian ;
Cheng, Li .
IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2013, 60 (06) :2273-2283
[9]   ε-SSVR: A smooth support vector machine for ε-insensitive regression [J].
Lee, YJ ;
Hsieh, WF ;
Huang, CM .
IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2005, 17 (05) :678-685
[10]   Support vector machine for breast MR image classification [J].
Lo, Chien-Shun ;
Wang, Chuin-Mu .
COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2012, 64 (05) :1153-1162