A Novel Method for Minimizing Loss of Accuracy in Naive Bayes Classifier

Cited: 0
Authors
Netti, Kalyan [1 ]
Radhika, Y. [2 ]
Affiliations
[1] NGRI, Hyderabad, Andhra Pradesh, India
[2] GITAM Univ, Dept CSE, Visakhapatnam, Andhra Pradesh, India
Source
2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND COMPUTING RESEARCH (ICCIC) | 2015
Keywords
Data Mining; Classification; Naive Bayes Classifier; Conditional Independence; Standard Deviation; Smoothing; Accuracy
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
In Data Mining, classification plays a prominent role in predicting outcomes, and Naive Bayes Classification is one of the best supervised classification techniques: it predicts outcomes well and often outperforms other classifiers. One reason for this strong performance is its assumption of conditional independence among predictors. However, this very strong assumption also leads to a loss of accuracy. In this paper, the authors propose a novel method for improving accuracy in the Naive Bayes Classifier. The proposed technique gave better accuracy in the NBC even under the conditional-independence assumption.
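The record does not detail the paper's method itself. For context, the conditional-independence assumption the abstract refers to can be illustrated with a minimal Gaussian Naive Bayes sketch: each feature is modeled independently given the class, so per-feature log-likelihoods simply add. The toy data, the `EPS` variance-smoothing constant, and all function names below are illustrative assumptions, not taken from the paper.

```python
import math

# Minimal Gaussian Naive Bayes sketch (illustrative, not the paper's method).
# Under conditional independence, P(x | c) = prod_j P(x_j | c), so the
# per-feature log-likelihoods are summed.

EPS = 1e-9  # small variance-smoothing term to avoid division by zero


def fit(X, y):
    """Estimate per-class priors, feature means, and standard deviations."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        n, d = len(rows), len(rows[0])
        means = [sum(r[j] for r in rows) / n for j in range(d)]
        stds = [math.sqrt(sum((r[j] - means[j]) ** 2 for r in rows) / n + EPS)
                for j in range(d)]
        model[c] = (n / len(X), means, stds)
    return model


def predict(model, x):
    """Pick the class maximizing log prior + sum of per-feature log likelihoods."""
    best, best_score = None, -math.inf
    for c, (prior, means, stds) in model.items():
        score = math.log(prior)
        for xj, mu, sd in zip(x, means, stds):
            # independence assumption: likelihoods multiply, so logs add
            score += (-math.log(sd * math.sqrt(2 * math.pi))
                      - (xj - mu) ** 2 / (2 * sd ** 2))
        if score > best_score:
            best, best_score = c, score
    return best


# toy two-class data (hypothetical)
X = [[1.0, 2.1], [1.2, 1.9], [3.8, 4.2], [4.1, 3.9]]
y = [0, 0, 1, 1]
m = fit(X, y)
print(predict(m, [1.1, 2.0]))  # -> 0
print(predict(m, [4.0, 4.0]))  # -> 1
```

A method addressing the accuracy loss described in the abstract would modify how these per-feature terms are estimated or combined; the keywords (standard deviation, smoothing) suggest the paper works at that level.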
Pages: 364-367
Page count: 4
References
9 records
  • [1] [Anonymous], 2002, Data mining: Introductory and advanced topics
  • [2] Domingos P., 1996, Proceedings of the 13th International Conference on Machine Learning, P105
  • [3] Haleem H., 2014, 5 INT C COMP COMM TE
  • [4] Han J, 2012, MOR KAUF D, P1
  • [5] Janasthar S., 2014, INT COMP SCI ENG C I
  • [6] Kuzma H, DATA MINING, P47
  • [7] Soon Lay-Ki, 2007, EXPLORATIVE DATA MIN, P562
  • [8] Wang Xi-Zhao, 2014, IEEE T CYBERNETICS, V44
  • [9] Wilson M L, 2009, P 6 INT WORKSH SEM W