A Novel Multiple Classifier Generation and Combination Framework Based on Fuzzy Clustering and Individualized Ensemble Construction

Cited by: 1
Authors
Gao, Zhen [1 ]
Zand, Maryam [1 ]
Ruan, Jianhua [1 ]
Affiliations
[1] Univ Texas San Antonio, Dept Comp Sci, San Antonio, TX 78249 USA
Source
2019 IEEE INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA 2019) | 2019
Funding
US National Science Foundation; US National Institutes of Health
Keywords
Classification; Multiple classifier system; Ensemble Learning; Model Selection; Instance Selection; SELECTION; MIXTURES;
DOI
10.1109/DSAA.2019.00038
CLC number
TP [Automation & Computer Technology]
Discipline code
0812
Abstract
The multiple classifier system (MCS) has become a successful alternative for improving classification performance. However, studies have reported inconsistent results for different MCSs, and it is often difficult to predict which MCS algorithm works best on a particular problem. We believe that the two crucial steps of an MCS - base classifier generation and multiple classifier combination - need to be designed in coordination to produce robust results. In this work, we show that for different testing instances, better classifiers may be trained from different subdomains of training instances, including, for example, instances neighboring the testing instance, or even instances far away from it. To exploit this intuition, we propose the Individualized Classifier Ensemble (ICE). ICE groups training data into overlapping clusters, builds a classifier for each cluster, and then associates each training instance with its top-performing models, taking into account model types and frequency. At testing time, ICE finds the k most similar training instances for a testing instance, then predicts the class label of the testing instance by averaging the predictions of the models associated with these training instances. Evaluation results on 49 benchmarks show that ICE achieves stable improvements over existing MCS methods on a significant proportion of datasets. ICE offers a novel way of exploiting internal patterns among instances to improve classification, can be easily combined with various classification models, and is applicable to many application domains.
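The abstract outlines a concrete pipeline: overlapping clusters of training data, one base model per cluster, per-instance model association, and k-nearest-neighbor-driven model combination at test time. The following is a minimal, runnable sketch of that workflow, not the authors' implementation: the k-means clustering, the "join the two nearest centroids" overlap rule, the nearest-class-mean base learner, and majority voting in place of prediction averaging are all simplifying assumptions.

```python
# Illustrative sketch of the ICE idea (simplified assumptions, see lead-in).
import math
import random
from collections import Counter

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means returning centroids; used only to seed overlapping clusters."""
    rng = random.Random(seed)
    cents = rng.sample(X, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in X:
            groups[min(range(k), key=lambda j: dist(x, cents[j]))].append(x)
        cents = [tuple(sum(c) / len(g) for c in zip(*g)) if g else cents[j]
                 for j, g in enumerate(groups)]
    return cents

class NearestClassMean:
    """Toy base classifier: predict the class whose mean is nearest."""
    def fit(self, X, y):
        self.means = {c: tuple(sum(v) / len(v) for v in
                               zip(*[x for x, yy in zip(X, y) if yy == c]))
                      for c in set(y)}
        return self
    def predict(self, x):
        return min(self.means, key=lambda c: dist(x, self.means[c]))

class ICE:
    def __init__(self, n_clusters=3, overlap=2, k=3):
        self.n_clusters, self.overlap, self.k = n_clusters, overlap, k
    def fit(self, X, y):
        self.X, self.y = X, y
        cents = kmeans(X, self.n_clusters)
        # Overlapping clusters: each instance joins its `overlap` nearest centroids.
        members = [[] for _ in cents]
        for i, x in enumerate(X):
            order = sorted(range(len(cents)), key=lambda j: dist(x, cents[j]))
            for j in order[:self.overlap]:
                members[j].append(i)
        # One base model per (multi-class) cluster.
        self.models = [NearestClassMean().fit([X[i] for i in m], [y[i] for i in m])
                       for m in members if len(set(y[i] for i in m)) > 1]
        # Associate each training instance with the models that classify it
        # correctly (fall back to all models if none do).
        self.assoc = [[m for m in self.models if m.predict(x) == y[i]] or self.models
                      for i, x in enumerate(X)]
        return self
    def predict(self, x):
        # k nearest training instances pick the models; combine by majority vote.
        near = sorted(range(len(self.X)), key=lambda i: dist(x, self.X[i]))[:self.k]
        votes = Counter(m.predict(x) for i in near for m in self.assoc[i])
        return votes.most_common(1)[0][0]
```

Because each test instance draws on models vetted on its own neighborhood, the ensemble composition is individualized per query, which is the core distinction the paper draws from fixed-combination MCSs.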
Pages: 231-240 (10 pages)