Filter Method Ensemble with Neural Networks

Times Cited: 1
Authors
Chakraborty, Anuran [1 ]
De, Rajonya [1 ]
Chatterjee, Agneet [1 ]
Schwenker, Friedhelm [2 ]
Sarkar, Ram [1 ]
Affiliations
[1] Jadavpur Univ, Kolkata, India
[2] Univ Ulm, Inst Neural Informat Proc, Ulm, Germany
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019 / Vol. 11728
Keywords
Ensemble; Neural networks; Feature selection
DOI
10.1007/978-3-030-30484-3_59
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The main idea behind designing a multiple classifier system is to combine several classifiers so that the resulting system outperforms each individual classifier by pooling the decisions of all of them. Combining relatively simple pattern recognition models with limited individual performance is common in the literature. Such an ensemble performs better when each learner is well trained, and learners with different working principles add diversity to the ensemble. In this paper, we first select three optimal feature subsets using three different filter methods, namely Mutual Information (MI), Chi-square, and ANOVA F-test. With each selected subset we then build a learning model using a Multi-layer Perceptron (MLP) based classifier. The class membership values produced by these three classifiers for each sample are concatenated and fed to a further MLP based classifier. Experiments on five datasets from the UCI Machine Learning Repository, namely Arrhythmia, Ionosphere, Hill-Valley, Waveform, and Horse Colic, show the effectiveness of the proposed ensemble model.
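A minimal sketch of the pipeline described in the abstract, assuming scikit-learn-style components. The feature count (k=10), network sizes, and synthetic data are illustrative placeholders, not the authors' exact configuration.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif, chi2, f_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import MinMaxScaler

# Placeholder data standing in for one of the UCI datasets
X, y = make_classification(n_samples=600, n_features=40, random_state=0)
X = MinMaxScaler().fit_transform(X)  # chi2 requires non-negative features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One (filter method, MLP) pair per filter: MI, Chi-square, ANOVA F-test
base_models = []
for score_fn in (mutual_info_classif, chi2, f_classif):
    selector = SelectKBest(score_fn, k=10).fit(X_tr, y_tr)
    mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    mlp.fit(selector.transform(X_tr), y_tr)
    base_models.append((selector, mlp))

def membership_matrix(X_part):
    # Concatenate the class membership (probability) outputs of the base MLPs
    return np.hstack([mlp.predict_proba(sel.transform(X_part))
                      for sel, mlp in base_models])

# Second-level MLP trained on the concatenated membership values
# (trained on the same split here for brevity; out-of-fold predictions
# would be the more careful stacking choice)
meta_mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
meta_mlp.fit(membership_matrix(X_tr), y_tr)
print("ensemble accuracy:", meta_mlp.score(membership_matrix(X_te), y_te))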
Pages: 755-765 (11 pages)