Cost-sensitive probability for weighted voting in an ensemble model for multi-class classification problems

Cited by: 12
Authors
Rojarath, Artittayapron [1 ]
Songpan, Wararat [1 ]
Affiliations
[1] Khon Kaen Univ, Fac Sci, Dept Comp Sci, Khon Kaen, Thailand
Keywords
Ensemble learning; Multi-class data; Cost-sensitive learning; True positive; DECISION-MAKING; RANDOM FORESTS; ARTIFICIAL-INTELLIGENCE; HETEROGENEOUS ENSEMBLE; LABEL CLASSIFICATION; STACKING; NETWORKS; PERSPECTIVE; UNCERTAINTY; PERFORMANCE;
DOI
10.1007/s10489-020-02106-3
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Ensemble learning is an approach that combines various types of classification models and can enhance the prediction performance of its component models. However, the effectiveness of the combination typically depends on the diversity and accuracy of the component models' predictions, and multi-class data remain a challenge. In the proposed approach, cost-sensitive learning was implemented to evaluate the prediction accuracy for each class, which was used to construct a cost-sensitivity matrix of true-positive (TP) rates. Each TP rate serves as a weight that is combined with a predicted probability to drive the ensemble's decision for a specified class. We proposed a heterogeneous ensemble model, i.e., a combination of various individual classification models (support vector machine, Bayes, K-nearest neighbour, naive Bayes, decision tree, and multi-layer perceptron), in experiments on 3-, 4-, 5- and 6-classifier models. The efficiencies of the proposed models were compared to those of the individual classifier models and of homogeneous ensembles (AdaBoost, bagging, stacking, voting, random forest, and random subspaces) on various multi-class data sets. The experimental results demonstrate that the cost-sensitive probability weighted voting ensemble derived from 3 models provided the most accurate multi-class predictions. The objective of this study was to increase the efficiency of predicting classification results in multi-class classification tasks and to improve the classification results.
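The abstract's combination rule can be sketched as follows. This is a minimal, hedged interpretation rather than the authors' exact implementation: each base classifier's predicted probability for a class is weighted by that classifier's per-class TP rate (taken from a validation-derived cost-sensitivity matrix), and the class with the highest summed score wins. The function name and the example numbers are illustrative assumptions.

```python
import numpy as np

def cs_weighted_vote(probs, tp_rates):
    """Cost-sensitive probability weighted voting (assumed form).

    probs:    (n_classifiers, n_classes) predicted class probabilities
    tp_rates: (n_classifiers, n_classes) per-class TP rates used as weights
    Returns the index of the winning class.
    """
    # Weight each classifier's probability by its per-class TP rate,
    # then sum the weighted scores over classifiers.
    scores = (np.asarray(probs) * np.asarray(tp_rates)).sum(axis=0)
    return int(np.argmax(scores))

# Three hypothetical classifiers, three classes.
probs = [[0.6, 0.3, 0.1],
         [0.2, 0.5, 0.3],
         [0.4, 0.4, 0.2]]
tp    = [[0.9, 0.5, 0.7],   # classifier 1 is most reliable on class 0
         [0.4, 0.8, 0.6],
         [0.7, 0.7, 0.5]]
print(cs_weighted_vote(probs, tp))  # -> 0 (class 0 wins: 0.90 vs 0.83 vs 0.35)
```

Note how a classifier with a low TP rate on a class contributes little to that class's score even when it assigns it a high probability, which is the intended cost-sensitive effect.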
Pages: 4908-4932
Page count: 25