Entropy Regularized Likelihood Learning on Gaussian Mixture: Two Gradient Implementations for Automatic Model Selection

Cited by: 0
Authors
Zhiwu Lu
Affiliation
[1] Institute of Computer Science and Technology, Peking University
Source
Neural Processing Letters | 2007, Volume 25
Keywords
competitive learning; Gaussian mixture; model selection; regularization theory
Abstract
In Gaussian mixture modeling, it is crucial to select the number of Gaussians, that is, the mixture model, for a given sample data set. Under regularization theory, we aim to solve this model selection problem by implementing entropy regularized likelihood (ERL) learning on Gaussian mixtures via a batch gradient learning algorithm. Simulation experiments demonstrate that this gradient ERL learning algorithm can automatically select an appropriate number of Gaussians during parameter learning on a sample data set and leads to a good estimate of the parameters of the actual Gaussian mixture, even when two or more of the actual Gaussians overlap strongly. We further give an adaptive gradient implementation of ERL learning on Gaussian mixtures, followed by a theoretical analysis, and find a mechanism of generalized competitive learning implied in ERL learning.
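To make the batch gradient implementation concrete, the sketch below performs gradient ascent on one plausible form of an ERL objective: the average log-likelihood of an isotropic Gaussian mixture plus a weight gamma times the negative Shannon entropy of the posterior component probabilities. This is an illustrative assumption, not the paper's exact ERL functional; the names erl_objective and gradient_step and the hyperparameters K, gamma, and lr are hypothetical. Pushing the posteriors toward near-hard assignments is what allows the mixing weights of redundant components to decay toward zero, which is the automatic model selection behavior described in the abstract.

```python
# Minimal sketch (not the paper's exact algorithm): batch gradient ascent on an
# assumed entropy regularized likelihood (ERL) objective for an isotropic
# Gaussian mixture, using JAX autodiff for the batch gradient.
import jax
import jax.numpy as jnp

def log_components(params, X):
    """Log of alpha_k * N(x | mu_k, var_k * I) for every sample and component."""
    logits, mus, log_vars = params
    log_alpha = jax.nn.log_softmax(logits)       # mixing weights kept on the simplex
    var = jnp.exp(log_vars)                      # isotropic variances kept positive
    d = X.shape[1]
    sq = jnp.sum((X[:, None, :] - mus[None, :, :]) ** 2, axis=2)   # (N, K)
    log_pdf = -0.5 * sq / var - 0.5 * d * jnp.log(2 * jnp.pi * var)
    return log_alpha + log_pdf                   # (N, K)

def erl_objective(params, X, gamma=0.4):
    """Assumed ERL form: mean log-likelihood minus gamma * posterior entropy."""
    log_joint = log_components(params, X)
    log_lik = jax.scipy.special.logsumexp(log_joint, axis=1)   # log p(x_t)
    log_post = log_joint - log_lik[:, None]                    # log P(k | x_t)
    post = jnp.exp(log_post)
    neg_entropy = jnp.sum(post * log_post, axis=1)             # = -H(P(. | x_t))
    return jnp.mean(log_lik) + gamma * jnp.mean(neg_entropy)

@jax.jit
def gradient_step(params, X, lr=0.05, gamma=0.4):
    """One batch gradient ascent step on the assumed ERL objective."""
    grads = jax.grad(erl_objective)(params, X, gamma)
    return jax.tree_util.tree_map(lambda p, g: p + lr * g, params, grads)

if __name__ == "__main__":
    # Two actual Gaussians, deliberately fitted with K = 6 candidate components.
    key = jax.random.PRNGKey(0)
    k1, k2, k3 = jax.random.split(key, 3)
    X = jnp.concatenate([
        jax.random.normal(k1, (200, 2)) * 0.5 + jnp.array([-2.0, 0.0]),
        jax.random.normal(k2, (200, 2)) * 0.5 + jnp.array([2.0, 0.0]),
    ])
    K = 6
    params = (jnp.zeros(K),                      # mixing-weight logits
              jax.random.normal(k3, (K, 2)),     # component means
              jnp.zeros(K))                      # log variances
    for _ in range(2000):
        params = gradient_step(params, X)
    print("learned mixing weights:", jax.nn.softmax(params[0]))
```

Under this assumed objective, running the script with more candidate components than actual Gaussians should typically leave only a few mixing weights well above zero; components whose weights collapse can then be discarded, loosely mirroring the generalized competitive learning mechanism mentioned in the abstract.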
Pages: 17-30
Page count: 13