Incremental learning for ν-Support Vector Regression

Cited by: 409
Authors
Gu, Bin [1 ,2 ,3 ,4 ]
Sheng, Victor S. [5 ]
Wang, Zhijie [6 ]
Ho, Derek [7 ]
Osman, Said [8 ]
Li, Shuo [4 ,6 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Jiangsu Engn Ctr Network Monitoring, Nanjing, Jiangsu, Peoples R China
[2] Jiangsu Collaborat Innovat Ctr Atmospher Environm, Nanjing, Jiangsu, Peoples R China
[3] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing, Jiangsu, Peoples R China
[4] Univ Western Ontario, Dept Med Biophys, London, ON, Canada
[5] Univ Cent Arkansas, Dept Comp Sci, Conway, AR USA
[6] GE Hlth Care, London, ON, Canada
[7] Victoria Hosp, London Hlth Sci Ctr, London, ON N6A 4G5, Canada
[8] St Josephs Hlth Care, London, ON, Canada
Funding
US National Science Foundation; National Natural Science Foundation of China;
Keywords
Incremental learning; Online learning; ν-Support Vector Regression; Support vector machine;
DOI
10.1016/j.neunet.2015.03.013
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
ν-Support Vector Regression (ν-SVR) is an effective regression learning algorithm, which has the advantage of using a parameter ν to control the number of support vectors and to adjust the width of the tube automatically. However, compared to ν-Support Vector Classification (ν-SVC) (Schölkopf et al., 2000), ν-SVR introduces an additional linear term into its objective function. Thus, directly applying the accurate on-line ν-SVC algorithm (AONSVM) to ν-SVR will not generate an effective initial solution; this is the main challenge in designing an incremental ν-SVR learning algorithm. To overcome this challenge, we propose a special procedure called initial adjustments in this paper. This procedure adjusts the weights of ν-SVC based on the Karush-Kuhn-Tucker (KKT) conditions to prepare an initial solution for the incremental learning. Combining the initial adjustments with the two steps of AONSVM produces an exact and effective incremental ν-SVR learning algorithm (INSVR). Theoretical analysis proves the existence of the three key inverse matrices, which are the cornerstones of the three steps of INSVR (including the initial adjustments), respectively. Experiments on benchmark datasets demonstrate that INSVR avoids infeasible updating paths as far as possible, and successfully converges to the optimal solution. The results also show that INSVR is faster than batch ν-SVR algorithms with both cold and warm starts. (C) 2015 Elsevier Ltd. All rights reserved.
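For context, a sketch of the standard ν-SVR formulation from Schölkopf et al. (2000) that makes the "additional linear term" concrete. The notation (feature map φ, kernel K, regularization constant C, sample size n) follows the common textbook presentation and is assumed here, not quoted from the paper itself.

Primal:

\begin{aligned}
\min_{w,\,b,\,\varepsilon,\,\xi,\,\xi^{*}}\quad
  & \tfrac{1}{2}\lVert w\rVert^{2}
    + C\Bigl(\nu\varepsilon + \tfrac{1}{n}\sum_{i=1}^{n}\bigl(\xi_i + \xi_i^{*}\bigr)\Bigr) \\
\text{s.t.}\quad
  & \bigl(\langle w,\phi(x_i)\rangle + b\bigr) - y_i \le \varepsilon + \xi_i, \\
  & y_i - \bigl(\langle w,\phi(x_i)\rangle + b\bigr) \le \varepsilon + \xi_i^{*}, \\
  & \xi_i,\ \xi_i^{*} \ge 0, \qquad \varepsilon \ge 0.
\end{aligned}

Dual:

\begin{aligned}
\max_{\alpha,\,\alpha^{*}}\quad
  & -\tfrac{1}{2}\sum_{i,j}\bigl(\alpha_i^{*}-\alpha_i\bigr)\bigl(\alpha_j^{*}-\alpha_j\bigr)K(x_i,x_j)
    + \sum_{i} y_i\bigl(\alpha_i^{*}-\alpha_i\bigr) \\
\text{s.t.}\quad
  & \sum_{i}\bigl(\alpha_i^{*}-\alpha_i\bigr) = 0, \qquad
    \sum_{i}\bigl(\alpha_i+\alpha_i^{*}\bigr) \le C\nu, \\
  & 0 \le \alpha_i,\ \alpha_i^{*} \le \tfrac{C}{n}, \quad i = 1,\dots,n.
\end{aligned}

Unlike the ν-SVC dual, whose objective is purely quadratic in the multipliers, the ν-SVR dual carries the linear term \(\sum_i y_i(\alpha_i^{*}-\alpha_i)\). This is the additional linear term referred to in the abstract: it changes the structure of the KKT conditions, so AONSVM's initial solution cannot be reused directly, which is what the initial-adjustments procedure compensates for.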
Pages: 140-150
Page count: 11
References
36 in total
[1] Anonymous. UCI Machine Learning Repository, 2010.
[2] Anonymous. J. Am. Stat. Assoc., 2001.
[3] Anonymous. The Theory of Matrices in Numerical Analysis, 1964.
[4] Boyd, S., & Vandenberghe, L. Convex Optimization. Cambridge University Press, 2004. DOI 10.1017/CBO9780511804441
[5] Cao, L. J., & Tay, F. E. H. Support vector machine with adaptive parameters in financial time series forecasting. IEEE Transactions on Neural Networks, 2003, 14(6), 1506-1518.
[6] Cauwenberghs, G., & Poggio, T. Incremental and decremental support vector machine learning. Advances in Neural Information Processing Systems, 2001, 13, 409.
[7] Chang, C.-C., & Lin, C.-J. Training ν-support vector regression: Theory and algorithms. Neural Computation, 2002, 14(8), 1959-1977.
[8] Chang, C.-C., & Lin, C.-J. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2011, 2(3).
[9] Chen, P.-H., Lin, C.-J., & Schölkopf, B. A tutorial on ν-support vector machines. Applied Stochastic Models in Business and Industry, 2005, 21(2), 111-136.
[10] Diehl, C. P., & Cauwenberghs, G. SVM incremental learning, adaptation and optimization. In Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN), 2003, 2685.