Local feature weighting in nearest prototype classification

Cited by: 30
Authors
Fernandez, Fernando [1]
Isasi, Pedro [1]
Affiliations
[1] Univ Carlos III Madrid, Dept Informat, Madrid 28911, Spain
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2008, Vol. 19, No. 1
Keywords
evolutionary learning; local feature weighting (LFW); nearest prototype (NP) classification; weighted Euclidean distance
DOI
10.1109/TNN.2007.902955
CLC classification number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The distance metric is the cornerstone of nearest neighbor (NN)-based methods, and therefore of nearest prototype (NP) algorithms, because they classify according to the similarity of the data. When the data are characterized by a set of features that may contribute to the classification task at different levels, feature weighting or selection is required, sometimes in a local sense. However, local weighting is typically restricted to NN approaches. In this paper, we introduce local feature weighting (LFW) in NP classification. LFW provides each prototype with its own weight vector, in contrast to the typical global weighting methods found in the NP literature, where all prototypes share the same one. Giving each prototype its own weight vector has a novel effect on the borders of the generated Voronoi regions: they become nonlinear. We have integrated LFW with a previously developed evolutionary nearest prototype classifier (ENPC). Experiments performed on both artificial and real data sets demonstrate that the resulting algorithm, which we call LFW in nearest prototype classification (LFW-NPC), avoids overfitting on training data in domains where the features may contribute differently to the classification task in different areas of the feature space. This generalization capability is also reflected in the automatic discovery of an accurate and reduced set of prototypes.
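The decision rule described in the abstract can be sketched as follows. This is a minimal illustration of NP classification with a per-prototype weighted Euclidean distance, not the authors' ENPC/LFW-NPC implementation; all function names, prototype positions, and weight values are hypothetical, chosen only to show how differing local weights change which prototype is nearest.

```python
import numpy as np

def weighted_distance(x, prototype, weights):
    """Weighted Euclidean distance between a point and one prototype.

    Under LFW, each prototype carries its own weight vector, so the
    same feature can matter more near one prototype than near another.
    """
    diff = x - prototype
    return np.sqrt(np.sum(weights * diff ** 2))

def lfw_np_classify(x, prototypes, weight_vectors, labels):
    """Assign x the label of the prototype with the smallest
    locally weighted distance."""
    dists = [weighted_distance(x, p, w)
             for p, w in zip(prototypes, weight_vectors)]
    return labels[int(np.argmin(dists))]

# Two prototypes with different local weightings (illustrative values):
prototypes = np.array([[0.0, 0.0], [4.0, 0.0]])
weights = np.array([[1.0, 0.1],   # prototype "A" largely ignores feature 2
                    [0.1, 1.0]])  # prototype "B" largely ignores feature 1
labels = ["A", "B"]

print(lfw_np_classify(np.array([1.0, 3.0]), prototypes, weights, labels))
```

Because the two prototypes weight the features differently, the boundary between their regions is no longer the straight bisecting hyperplane of an ordinary Voronoi partition, which illustrates the nonlinear borders the abstract mentions.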
Pages: 40-53
Page count: 14
References
56 in total
[1] Aha, D. W. (1992). Proceedings of the Fourteenth Annual Conference of the Cognitive Science Society, p. 534.
[2] [Anonymous] (2005). Data Mining: Practical Machine Learning Tools and Techniques.
[3] [Anonymous] (1961). Adaptive Control Processes: A Guided Tour. DOI 10.1515/9781400874668.
[4] [Anonymous]. ADV NEURAL INFORM PR.
[5] Atkeson, C. G. (1997). Artificial Intelligence Review, 11, 11. DOI 10.1023/A:1006559212014.
[6] Bach, F. R. (2002). Journal of Machine Learning Research, 3, 1.
[7] Berglund, E., & Sitte, J. (2006). The parameterless self-organizing map algorithm. IEEE Transactions on Neural Networks, 17(2), 305-316.
[8] Bermejo, S., & Cabestany, J. (2000). A batch learning vector quantization algorithm for nearest neighbour classification. Neural Processing Letters, 11(3), 173-184.
[9] Bezdek, J. C., & Kuncheva, L. I. (2001). Nearest prototype classifier designs: An experimental study. International Journal of Intelligent Systems, 16(12), 1445-1473.
[10] Blake, C. L. (1998). UCI repository of machine learning databases.