A Lazy Learning Approach for Building Classification Models

Cited by: 11
Authors
Galvan, Ines M. [1 ]
Valls, Jose M. [1 ]
Garcia, Miguel [1 ]
Isasi, Pedro [1 ]
Affiliation
[1] Univ Carlos III Madrid, Dept Comp Sci, Madrid 28911, Spain
Keywords
DOI
10.1002/int.20493
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose a lazy learning strategy for building classification models. Instead of learning a model from the whole training data set before any new instance is observed, a selection of patterns is made depending on the query received, and a classification model is learned from those selected patterns. The selection of patterns is not homogeneous: the number of selected patterns depends on the position of the query instance in the input space. The selection is made using a weighting function that gives more importance to the training patterns most similar to the query instance. Our intention is to provide a lazy learning mechanism suited to any machine learning classification algorithm. For this reason, we study two different methods that avoid fixing any parameter. Experimental results show that the classification rates of traditional machine learning algorithms based on trees, rules, or functions can be improved when they are learned with the proposed lazy learning approach. (C) 2011 Wiley Periodicals, Inc.
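The query-dependent selection described in the abstract can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation: the Gaussian weighting function, the weight threshold, and the nearest-centroid local model are assumptions made here for concreteness (the paper pairs the selection step with arbitrary classifiers such as trees, rules, or functions).

```python
import numpy as np

def lazy_classify(X_train, y_train, query, bandwidth=1.0, weight_threshold=0.1):
    """Hypothetical sketch of lazy, query-dependent learning:
    weight training patterns by similarity to the query, keep those
    above a threshold, and fit a simple local model on the subset."""
    # Weighting function: Gaussian kernel on Euclidean distance,
    # so patterns closer to the query get more importance.
    d = np.linalg.norm(X_train - query, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)

    # Query-dependent selection: how many patterns survive depends on
    # where the query lies in the input space.
    mask = w >= weight_threshold
    Xs, ys, ws = X_train[mask], y_train[mask], w[mask]

    # Local model (assumed here): weighted nearest-centroid classifier
    # trained only on the selected patterns.
    classes = np.unique(ys)
    centroids = np.array([
        np.average(Xs[ys == c], axis=0, weights=ws[ys == c])
        for c in classes
    ])
    return classes[np.argmin(np.linalg.norm(centroids - query, axis=1))]
```

With a narrow bandwidth, distant patterns fall below the threshold and the local model is trained on only a handful of neighbors; with a wide bandwidth, the selection approaches the full (eager) training set.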
Pages: 773-786
Page count: 14
References
12 entries in total
[1] AHA DW, 1991, MACH LEARN, V6, P37, DOI 10.1007/BF00153759
[2] ATKESON C, 1997, ARTIF INTELL REV, V11, P11
[3] BOTTOU L, VAPNIK V, Local Learning Algorithms [J], NEURAL COMPUTATION, 1992, 4(06):888-900
[4] Dasarathy B.V., 1991, NEAREST NEIGHBOUR NN
[5] GALVAN IM, 2009, 5 IFIP C ART INT APP, V296, P517
[6] GALVAN IM, 2001, INT J NEURAL SYST, V10, P167
[7] LANGLEY P, 1992, NAT C ART INT
[8] Quinlan J.R., 1993, C4.5: Programs for Machine Learning
[9] Valls JM, 2007, AI COMMUN, V20, P71
[10] Vapnik V., 1998, Statistical Learning Theory, P5