The learning and optimization of full Bayes classifiers with continuous attributes

Cited by: 0
Authors
Affiliations
[1] School of Mathematics and Information, Shanghai Lixin University of Commerce
[2] Open Economic and Trade Research Center, Shanghai Lixin University of Commerce
Source
Wang, S.-C. (wangsc@lixin.edu.cn) | Science Press, Vol. 35, 2012
Keywords
Continuous attributes; Dynamic full Bayes classifiers; Full Bayes classifiers; Gaussian kernel function; Smoothing parameters
DOI
10.3724/SP.J.1016.2012.02129
Abstract
Naive Bayes classifiers with continuous attributes cannot make effective use of the conditional dependency information among attributes, and in dependency extensions of the naive Bayes classifier it is difficult to integrate the optimization of the attribute conditional joint density estimate with the structure learning of the classifier. In this paper, a full Bayes classifier for continuous attributes with multiple smoothing parameters is presented, built on a multivariate Gaussian kernel function estimate of the conditional joint density of the attributes. The smoothing parameters are optimized by combining a classification-accuracy evaluation criterion with an exhaustive search over interval partitions with varying step lengths. A dynamic full Bayes classifier is also developed by combining the full Bayes classifier with time series. Experiments and analysis are carried out on continuous-attribute data sets from the UCI machine learning repository and from the macroeconomic domain. The results show that both optimized classifiers achieve very good classification accuracy.
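A minimal sketch of the approach described above, assuming an isotropic multivariate Gaussian product kernel with one smoothing parameter per class and a plain accuracy-scored grid search; the names FullBayesClassifier and tune_bandwidths are illustrative, not the paper's implementation:

```python
import numpy as np
from itertools import product


class FullBayesClassifier:
    """Full Bayes classifier for continuous attributes: the
    class-conditional joint density p(x|c) is a multivariate Gaussian
    kernel density estimate with a per-class smoothing parameter h_c."""

    def __init__(self, bandwidths):
        self.bandwidths = bandwidths  # dict: class label -> h_c

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.classes_ = np.unique(y)
        self.X_by_class_ = {c: X[y == c] for c in self.classes_}
        self.log_priors_ = {c: np.log(len(self.X_by_class_[c]) / len(y))
                            for c in self.classes_}
        self.d_ = X.shape[1]
        return self

    def _log_density(self, x, c):
        # log p(x|c) = log[(1/n_c) * sum_i N(x; x_i, h_c^2 I)],
        # computed with log-sum-exp for numerical stability.
        Xc, h = self.X_by_class_[c], self.bandwidths[c]
        sq = np.sum((Xc - x) ** 2, axis=1) / (2.0 * h ** 2)
        log_norm = -self.d_ * np.log(h * np.sqrt(2.0 * np.pi))
        m = np.max(-sq)
        return log_norm + m + np.log(np.mean(np.exp(-sq - m)))

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        out = []
        for x in X:
            scores = {c: self.log_priors_[c] + self._log_density(x, c)
                      for c in self.classes_}
            out.append(max(scores, key=scores.get))
        return np.array(out)


def tune_bandwidths(X, y, grid):
    """Exhaustive search over a bandwidth grid, one h_c per class,
    scored here by resubstitution accuracy for brevity; a held-out or
    leave-one-out estimate avoids the bias toward tiny bandwidths."""
    y = np.asarray(y)
    classes = np.unique(y)
    best_h, best_acc = None, -1.0
    for combo in product(grid, repeat=len(classes)):
        h = dict(zip(classes, combo))
        clf = FullBayesClassifier(h).fit(X, y)
        acc = float(np.mean(clf.predict(X) == y))
        if acc > best_acc:
            best_h, best_acc = h, acc
    return best_h, best_acc
```

With two classes and a 10-point grid such as np.linspace(0.1, 1.0, 10), the search evaluates 100 bandwidth pairs; refining a coarse grid around the best candidate with a smaller step mirrors the varying-step interval search mentioned in the abstract, and applying the fitted classifier over sliding time-series windows gives the flavor of the dynamic variant.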
Pages: 2129-2138
Number of pages: 9
References
14 in total
  • [1] Chow C.K., Liu C.N., Approximating discrete probability distributions with dependence trees, IEEE Transactions on Information Theory, 14, 3, pp. 462-467, (1968)
  • [2] Friedman N., Geiger D., Goldszmidt M., Bayesian network classifiers, Machine Learning, 29, 2-3, pp. 131-161, (1997)
  • [3] Grossman D., Domingos P., Learning Bayesian network classifiers by maximizing conditional likelihood, Proceedings of the 21st International Conference on Machine Learning, pp. 361-368, (2004)
  • [4] Jing Y.S., Pavlovic V., Rehg J.M., Boosted Bayesian network classifiers, Machine Learning, 73, 2, pp. 155-184, (2008)
  • [5] Webb G.I., Boughton J.R., Zheng F., et al., Learning by extrapolation from marginal to full-multivariate probability distributions: Decreasingly naive Bayesian classification, Machine Learning, 86, 2, pp. 233-272, (2012)
  • [6] John G.H., Langley P., Estimating continuous distributions in Bayesian classifiers, Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence (UAI-1995), pp. 338-345, (1995)
  • [7] Perez A., Larranaga P., Inza I., Supervised classification with conditional Gaussian networks: Increasing the structure complexity from naïve Bayes, International Journal of Approximate Reasoning, 43, 1, pp. 1-25, (2006)
  • [8] Perez A., Larranaga P., Inza I., Bayesian classifiers based on kernel density estimation: Flexible classifiers, International Journal of Approximate Reasoning, 50, 2, pp. 341-362, (2009)
  • [9] Huang S.C., Using Gaussian process based kernel classifiers for credit rating forecasting, Expert Systems with Applications, 38, 7, pp. 8607-8611, (2011)
  • [10] Li X.-S., Guo C.-X., Guo Y.-H., The credit scoring model on extended tree augment naive Bayesian network, Systems Engineering Theory & Practice, 28, 6, pp. 129-136, (2008)