Regularized least squares potential SVRs

Cited by: 0
Authors
Jayadeva [1]
Deb, Alok Kanti [1]
Khemchandani, Reshma [2]
Chandra, Suresh [2]
Affiliations
[1] Indian Institute of Technology, Department of Electrical Engineering, New Delhi 110016, India
[2] Indian Institute of Technology, Department of Mathematics, New Delhi 110016, India
Source
2006 Annual IEEE India Conference | 2006
Keywords
approximation methods; pattern classification; function approximation; least squares methods; machine learning; regression; support vector machines
DOI
Not available
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
In this paper, we propose a regularized least squares approach to Potential SVRs. The proposed solution involves inverting a single matrix of small dimension; in the case of linear SVRs, the size of this matrix is independent of the number of data samples. Results on benchmark data sets demonstrate the computational advantages of the proposal. A recent publication highlighted that the margin in Support Vector Machines (SVMs) is not scale invariant, which implies that an appropriate scaling can affect the generalization performance of an SVM-based regressor. Potential SVMs address this issue and suggest a new approach to regression.
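The computational claim in the abstract, that the linear case reduces to inverting one small matrix whose size does not depend on the number of samples, can be illustrated with a minimal sketch. The code below is not the paper's Potential SVR algorithm; it is a generic primal regularized least-squares regressor, where the normal-equations matrix is d x d (d = number of features) regardless of the sample count n.

    import numpy as np

    def rls_fit(X, y, lam=1.0):
        # Solve the regularized normal equations (X^T X + lam*I) w = X^T y.
        # A is d x d, where d = number of features, independent of the
        # number of samples n.
        d = X.shape[1]
        A = X.T @ X + lam * np.eye(d)
        b = X.T @ y
        return np.linalg.solve(A, b)  # one small linear solve

    # Usage: 10,000 samples, yet only a 5 x 5 system is solved.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((10_000, 5))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ w_true + 0.01 * rng.standard_normal(10_000)
    print(np.round(rls_fit(X, y, lam=1e-3), 3))

For comparison, a kernelized least-squares regressor typically requires solving an n x n system in the number of samples, which is why a sample-independent matrix dimension in the linear case is the computationally attractive property.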
Pages: 565+
Number of pages: 3
References
15 entries in total (first 10 shown)
[1] [Anonymous]. Proceedings of KDD 2001 (Knowledge Discovery and Data Mining).
[2] [Anonymous], 1998. Proceedings of the 15th International Conference on Machine Learning (ICML 1998).
[3] [Anonymous]. UCI Repository of Machine Learning Databases.
[4] Cawley, G.C.; Talbot, N.L.C. Reduced rank kernel ridge regression. Neural Processing Letters, 2002, 16(3): 293-302.
[5] Cristianini, N., 2000. An Introduction to Support Vector Machines.
[6] Hochreiter, S., 2004. Technical report, TU Berlin.
[7] Khemchandani, R., 2005. Workshop on Mathematical Programming and Data ...
[8] Mangasarian, O.L., 2000. Technical Report 00-06, Data Mining Institute, University of Wisconsin.
[9] Mangasarian, O.L., 2000. Technical Report 00-04, Data Mining Institute, University of Wisconsin.
[10] Musicant, D.R.; Feinberg, A. Active set support vector regression. IEEE Transactions on Neural Networks, 2004, 15(2): 268-275.