Combining classifiers using nearest decision prototypes

Cited by: 10
Authors
Kheradpisheh, Saeed Reza [1 ,3 ]
Behjati-Ardakani, Fatemeh [1 ]
Ebrahimpour, Reza [2 ]
Affiliations
[1] Univ Tehran, Dept Comp Sci, Sch Math Stat & Comp Sci, Tehran, Iran
[2] Shahid Rajaee Teacher Training Univ, Brain & Intelligent Syst Res Lab, Dept Elect & Comp Engn, Tehran, Iran
[3] Inst Res Fundamental Sci IPM, Sch Cognit Sci SCS, Tehran, Iran
Keywords
Decision prototype; Decision templates; Classifier fusion; K-nearest neighbor; Multiple classifiers; Fusion; Combination; Templates; Accuracy; Network
DOI
10.1016/j.asoc.2013.07.028
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We present a new fusion method for combining soft-level classifiers that can be viewed as a generalization of the decision templates method. Previous decision-template-based combiners employ a single prototype per class, but this global view often fails to represent the decision space adequately. This drawback severely degrades classification accuracy when training samples are scarce, when class regions in the decision space are island-shaped, or when classes overlap heavily in the decision space. To represent the decision space better, we apply a prototype selection method to obtain a set of local decision prototypes for each class. To classify a test pattern, its decision profile is computed and compared with all decision prototypes; for each class, the more decision prototypes that lie near the pattern's decision profile, the higher the chance of that class. The proposed method is evaluated on several well-known classification datasets, and the results suggest its superiority over competing combining techniques. (C) 2013 Elsevier B.V. All rights reserved.
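The combining rule summarized in the abstract lends itself to a short illustration. The following Python sketch assumes a list of already-trained base classifiers that expose predict_proba; it uses k-means clustering as an illustrative stand-in for the paper's prototype selection step and a k-nearest-prototype majority vote as the matching rule, so it should be read as a rough sketch of the idea under those assumptions, not the authors' implementation.

import numpy as np
from sklearn.cluster import KMeans

def build_decision_profiles(classifiers, X):
    # Concatenate the soft outputs of all base classifiers into one
    # decision-profile vector per sample (shape: n_samples x (L * C)).
    return np.hstack([clf.predict_proba(X) for clf in classifiers])

def fit_decision_prototypes(classifiers, X_train, y_train, n_prototypes=3):
    # For each class, cluster its training decision profiles and keep the
    # cluster centres as local decision prototypes. K-means is an
    # assumed, illustrative substitute for the paper's prototype selection.
    profiles = build_decision_profiles(classifiers, X_train)
    prototypes, proto_labels = [], []
    for c in np.unique(y_train):
        class_profiles = profiles[y_train == c]
        k = min(n_prototypes, len(class_profiles))
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(class_profiles)
        prototypes.append(km.cluster_centers_)
        proto_labels.append(np.full(k, c))
    return np.vstack(prototypes), np.concatenate(proto_labels)

def predict_nearest_prototypes(classifiers, prototypes, proto_labels, X_test, k=5):
    # Classify each test pattern by a k-nearest vote among all decision
    # prototypes: the class owning more nearby prototypes wins.
    profiles = build_decision_profiles(classifiers, X_test)
    preds = []
    for p in profiles:
        dists = np.linalg.norm(prototypes - p, axis=1)
        nearest = proto_labels[np.argsort(dists)[:k]]
        classes, counts = np.unique(nearest, return_counts=True)
        preds.append(classes[np.argmax(counts)])
    return np.array(preds)

Because the base classifiers' soft outputs are concatenated, each sample is mapped to a point in the decision space, so the prototypes are located in that space rather than in the original feature space.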
Pages: 4570-4578
Page count: 9