Generalized Competitive Learning of Gaussian Mixture Models

Cited by: 14
Authors
Lu, Zhiwu [1 ]
Ip, Horace H. S. [1 ,2 ]
Affiliations
[1] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[2] City Univ Hong Kong, Ctr Innovat Applicat Internet & Multimedia Techno, AIMtech Ctr, Kowloon, Hong Kong, Peoples R China
Source
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS | 2009, Vol. 39, No. 4
Keywords
Clustering analysis; competitive learning; Gaussian mixture; model selection; regularization theory; SEGMENTATION; ALGORITHM; LENGTH;
DOI
10.1109/TSMCB.2008.2012119
CLC number
TP [automation technology; computer technology]
Discipline code
0812
Abstract
When fitting Gaussian mixtures to multivariate data, it is crucial to select the appropriate number of Gaussians, which is generally referred to as the model selection problem. Under regularization theory, we aim to solve this model selection problem by developing an entropy regularized likelihood (ERL) learning method for Gaussian mixtures, and we further present a gradient algorithm for ERL learning. Theoretical analysis reveals a mechanism of generalized competitive learning inherent in ERL learning, which leads to automatic model selection on Gaussian mixtures and also makes the ERL learning algorithm less sensitive to initialization than the standard expectation-maximization algorithm. Experiments on simulated data verify the theoretical analysis. Moreover, the ERL learning algorithm outperforms other competitive learning algorithms in the application of unsupervised image segmentation.
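To make the idea of entropy-driven competitive pruning concrete, the following is a minimal, hypothetical Python sketch (NumPy only) of fitting an over-specified Gaussian mixture and letting poorly supported components die out. It is not the paper's gradient algorithm: the function name fit_gmm_entropy_sharpened, the sharpening exponent beta, and the pruning threshold prune_tol are illustrative assumptions, and the entropy regularization is approximated by sharpening the EM responsibilities, which likewise pushes posteriors toward hard assignments so that redundant Gaussians are annihilated.

```python
# Illustrative sketch only; NOT the ERL gradient algorithm from the paper.
# Assumptions: isotropic components, an over-estimated initial K, and an
# entropy-like effect obtained by sharpening responsibilities (beta > 1).
import numpy as np

def fit_gmm_entropy_sharpened(X, k_init=10, beta=1.5, prune_tol=1e-3,
                              n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Over-initialize: more components than we expect to need.
    mu = X[rng.choice(n, k_init, replace=False)].astype(float)
    var = np.full(k_init, X.var())            # isotropic variances
    pi = np.full(k_init, 1.0 / k_init)        # mixing weights

    for _ in range(n_iter):
        # E-step: responsibilities from Gaussian log-densities.
        diff = X[:, None, :] - mu[None, :, :]
        log_p = (-0.5 * (diff ** 2).sum(-1) / var
                 - 0.5 * d * np.log(2 * np.pi * var))
        log_r = np.log(pi) + log_p
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # Sharpening step: push posteriors toward 0/1 so weak components
        # lose support, mimicking the competitive effect of the entropy term.
        r = r ** beta
        r /= r.sum(axis=1, keepdims=True)

        # M-step on the sharpened responsibilities.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        diff = X[:, None, :] - mu[None, :, :]
        var = (r * (diff ** 2).sum(-1)).sum(0) / (d * nk)
        var = np.maximum(var, 1e-6)

        # Prune annihilated components (automatic model selection).
        keep = pi > prune_tol
        if keep.sum() < len(pi):
            pi, mu, var = pi[keep], mu[keep], var[keep]
            pi /= pi.sum()

    return pi, mu, var

# Usage: three well-separated 2-D clusters; the surplus components are
# typically pruned down to about three survivors.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(c, 0.3, size=(200, 2))
                   for c in ([0, 0], [4, 0], [0, 4])])
    pi, mu, var = fit_gmm_entropy_sharpened(X)
    print("surviving components:", len(pi))
```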
Pages: 901-909
Number of pages: 9