Decision-forest voting scheme for classification of rare classes in network intrusion detection

Cited by: 6
Authors
Brabec, Jan [1 ,2 ]
Machlica, Lukas [1 ]
Affiliations
[1] Cisco Syst Inc, Charles Sq Ctr, Karlovo Namesti 10 St, Prague 12000, Czech Republic
[2] Czech Tech Univ, Fac Elect Engn, Prague, Czech Republic
Source
2018 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC) | 2018
Keywords
IMBALANCED DATA
DOI
10.1109/SMC.2018.00563
CLC classification
TP3 [Computing technology; computer technology]
Subject classification
0812
Abstract
In this paper, Bayesian aggregation of decision trees in an ensemble (decision forest) is investigated. The focus is on multi-class classification with the number of samples significantly skewed toward one of the classes. The algorithm leverages out-of-bag datasets to estimate the prediction errors of individual trees, which are then used in accordance with the Bayes rule to refine the decision of the ensemble. The algorithm takes the prevalence of individual classes into account and does not require setting any additional parameters related to class weights or decision-score thresholds. Evaluation is based on publicly available datasets as well as on a proprietary dataset comprising network traffic telemetry from hundreds of enterprise networks with over a million users overall. The aim is to increase the detection capabilities of an operating malware detection system. While keeping the precision of the system above 94%, that is, at most 6 out of 100 detections shown to the network administrator are false alarms, we achieved an increase of approximately 7% in the number of detections. The algorithm effectively handles large amounts of data and can be used in conjunction with most of the state-of-the-art algorithms used to train decision forests.
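The abstract's idea of refining an ensemble decision with the Bayes rule can be illustrated with a minimal sketch. This is not the paper's exact algorithm; it assumes each tree's out-of-bag samples yield an estimated confusion matrix P(tree predicts j | true class i), which is combined with class prevalences to form a posterior over classes:

```python
import numpy as np

def bayes_aggregate(tree_preds, confusions, priors):
    """Combine per-tree votes into a class posterior via the Bayes rule.

    tree_preds : list of predicted class indices, one per tree.
    confusions : list of (C x C) arrays; confusions[t][i, j] estimates
                 P(tree t predicts j | true class is i), e.g. measured
                 on the tree's out-of-bag samples.
    priors     : length-C array of class prevalences P(class = i).
    """
    log_post = np.log(priors)
    for pred, conf in zip(tree_preds, confusions):
        # Likelihood of this tree's vote under each candidate true class
        # (small constant guards against log(0) for unseen outcomes).
        log_post += np.log(conf[:, pred] + 1e-12)
    post = np.exp(log_post - log_post.max())  # stabilized normalization
    return post / post.sum()

# Toy example: two classes (0 = benign, 1 = malicious), three trees
# sharing one hypothetical OOB confusion matrix, heavily skewed priors.
priors = np.array([0.99, 0.01])
conf = np.array([[0.95, 0.05],
                 [0.30, 0.70]])
posterior = bayes_aggregate([1, 1, 0], [conf] * 3, priors)
```

Note how the skewed prior lets the rare class be flagged only when the trees' votes carry enough evidence, which is the point of weighting votes by per-tree error estimates rather than counting them equally.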
Pages: 3325-3330
Page count: 6