Random subspace evidence classifier

Times cited: 16
Authors
Li, Haisheng [1 ]
Wen, Guihua [1 ]
Yu, Zhiwen [1 ]
Zhou, Tiangang [2 ]
Affiliations
[1] S China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Guangdong, Peoples R China
[2] Chinese Acad Sci, Inst Psychol, State Key Lab Brain & Cognit Sci, Beijing 100101, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Evidence theory; Nearest neighbors; Local hyperplane; Random subspace; LOCAL HYPERPLANE; NEIGHBOR; RULE;
DOI
10.1016/j.neucom.2012.11.019
Chinese Library Classification (CLC) number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Although many k-nearest neighbor approaches and variants exist, few of them exploit the information in both the whole feature space and its subspaces. To address this limitation, we propose a new classifier, the random subspace evidence classifier (RSEC). Specifically, RSEC first calculates the local hyperplane distance for each class as evidence, not only in the whole feature space but also in randomly generated feature subspaces. A basic belief assignment is then computed from these distances for the evidence of each class. Next, all the evidence, represented as basic belief assignments, is pooled by Dempster's rule. Finally, RSEC assigns a class label to each test sample based on the combined belief assignment. Experiments on datasets from the UCI machine learning repository, artificial data, and a face image database show that the proposed approach yields a lower average classification error than 7 existing k-nearest neighbor approaches and variants. In addition, RSEC performs well on average on high-dimensional data and on the minority class of imbalanced data. (C) 2013 Elsevier B.V. All rights reserved.
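The four steps described in the abstract (per-class distance evidence gathered in the full feature space and in random subspaces, conversion of distances to basic belief assignments, pooling by Dempster's rule, and a decision from the combined assignment) can be sketched roughly as follows. This is a simplified illustration under stated assumptions, not the authors' implementation: it substitutes the nearest-neighbor distance per class for the paper's local hyperplane distance, and the exponential mass function, the ignorance-mass scheme, and all names (`rsec_predict`, `bba_from_distances`, `dempster_combine`, `n_subspaces`, `subspace_frac`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bba_from_distances(dists, gamma=1.0):
    """Turn per-class distances into a basic belief assignment (BBA).
    Mass for class c is proportional to exp(-gamma * d_c); leftover mass
    goes to the whole frame (ignorance). Hypothetical simplification of
    the paper's local-hyperplane evidence."""
    support = np.exp(-gamma * np.asarray(dists))
    masses = support / (support.sum() + 1.0)  # reserve mass for ignorance
    ignorance = 1.0 - masses.sum()
    return masses, ignorance

def dempster_combine(m1, ign1, m2, ign2):
    """Dempster's rule for BBAs whose focal elements are the singleton
    classes plus the full frame of discernment."""
    comb = m1 * m2 + m1 * ign2 + ign1 * m2  # agreeing combinations
    ign = ign1 * ign2
    conflict = 1.0 - comb.sum() - ign       # mass on conflicting class pairs
    k = 1.0 - conflict                      # normalization constant
    return comb / k, ign / k

def rsec_predict(X, y, x_test, n_subspaces=5, subspace_frac=0.6):
    """Classify x_test by combining evidence from the full feature space
    and several random feature subspaces."""
    classes = np.unique(y)
    d = X.shape[1]
    feats_list = [np.arange(d)]             # evidence from the full space
    for _ in range(n_subspaces):            # ... plus random subspaces
        size = max(1, int(subspace_frac * d))
        feats_list.append(rng.choice(d, size=size, replace=False))
    m, ign = None, None
    for feats in feats_list:
        # distance to the nearest training sample of each class in this
        # subspace (stand-in for the local hyperplane distance)
        dists = [np.min(np.linalg.norm(X[y == c][:, feats] - x_test[feats],
                                       axis=1)) for c in classes]
        mi, igni = bba_from_distances(dists)
        m, ign = (mi, igni) if m is None else dempster_combine(m, ign, mi, igni)
    # assign the class with the largest combined belief mass
    return classes[np.argmax(m)]
```

The one design point worth noting is that reserving some mass for the full frame keeps each individual piece of evidence non-dogmatic, so Dempster's rule never divides by zero even when two subspaces favor different classes.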
Pages: 62-69
Page count: 8