Generalized Competitive Learning of Gaussian Mixture Models

Cited by: 14
Authors
Lu, Zhiwu [1 ]
Ip, Horace H. S. [1 ,2 ]
Affiliations
[1] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[2] City Univ Hong Kong, Ctr Innovat Applicat Internet & Multimedia Techno, AIMtech Ctr, Kowloon, Hong Kong, Peoples R China
Source
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS | 2009, Vol. 39, No. 4
Keywords
Clustering analysis; competitive learning; Gaussian mixture; model selection; regularization theory; segmentation; algorithm; length
DOI
10.1109/TSMCB.2008.2012119
Chinese Library Classification (CLC) code
TP [automation technology, computer technology]
Discipline classification code
0812
Abstract
When fitting Gaussian mixtures to multivariate data, it is crucial to select the appropriate number of Gaussians, a task generally referred to as the model selection problem. Under regularization theory, we address this problem by developing an entropy-regularized likelihood (ERL) learning approach for Gaussian mixtures, and we further present a gradient algorithm for ERL learning. Through theoretical analysis, we show that a generalized competitive learning mechanism is inherent in ERL learning; this mechanism leads to automatic model selection for Gaussian mixtures and makes the ERL learning algorithm less sensitive to initialization than the standard expectation-maximization algorithm. Experiments on simulated data verify the theoretical analysis. Moreover, the ERL learning algorithm outperforms other competitive learning algorithms when applied to unsupervised image segmentation.
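The abstract describes an entropy-regularized likelihood criterion whose built-in competition annihilates redundant Gaussians, so the component count need not be fixed in advance. As a rough illustration only, the NumPy sketch below fits a mixture started with too many components and lets an entropy-sharpened E-step plus a small pruning threshold remove the extras. The function name erl_fit, the responsibility exponent 1/(1 - gamma), and the pruning tolerance are assumptions made for this sketch; the paper itself derives a gradient algorithm for its ERL criterion, which may differ in detail.

    # Illustrative sketch only: an entropy-penalized, EM-style fit of a Gaussian
    # mixture that prunes redundant components.  The sharpened-responsibility
    # update (exponent 1/(1 - gamma)) is an assumed stand-in showing the same
    # qualitative competitive effect as ERL learning, not the authors' exact
    # gradient procedure.
    import numpy as np

    rng = np.random.default_rng(0)

    def gaussian_pdf(X, mean, cov):
        """Density of N(mean, cov) evaluated at each row of X."""
        d = X.shape[1]
        diff = X - mean
        cov = cov + 1e-6 * np.eye(d)          # ridge for numerical stability
        inv = np.linalg.inv(cov)
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
        return np.exp(-0.5 * np.sum(diff @ inv * diff, axis=1)) / norm

    def erl_fit(X, k_init=8, gamma=0.4, prune_tol=1e-2, n_iter=200):
        n, d = X.shape
        weights = np.full(k_init, 1.0 / k_init)
        means = X[rng.choice(n, k_init, replace=False)].copy()
        covs = np.array([np.cov(X.T) for _ in range(k_init)])
        for _ in range(n_iter):
            # E-step with entropy-sharpened responsibilities: raising the usual
            # posterior to the power 1/(1 - gamma) makes components compete.
            dens = np.stack([w * gaussian_pdf(X, m, c)
                             for w, m, c in zip(weights, means, covs)], axis=1)
            resp = dens ** (1.0 / (1.0 - gamma))
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: standard responsibility-weighted updates.
            nk = resp.sum(axis=0)
            weights = nk / n
            means = (resp.T @ X) / nk[:, None]
            covs = np.array([
                ((resp[:, j, None] * (X - means[j])).T @ (X - means[j])) / nk[j]
                for j in range(len(weights))])
            # Competitive pruning: drop components whose weight has collapsed.
            keep = weights > prune_tol
            weights = weights[keep] / weights[keep].sum()
            means, covs = means[keep], covs[keep]
        return weights, means, covs

    # Simulated data drawn from three well-separated Gaussians.
    X = np.vstack([rng.normal(c, 0.4, size=(200, 2))
                   for c in ((0, 0), (3, 3), (0, 4))])
    w, mu, _ = erl_fit(X)
    print("surviving components:", len(w))
    print("means:\n", np.round(mu, 2))

In this sketch the pruning threshold plays the role of the automatic annihilation that the paper attributes to the ERL criterion itself; with a smaller gamma the update approaches standard EM, in which no such competition occurs.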
Pages: 901-909
Number of pages: 9