Generalized Competitive Learning of Gaussian Mixture Models

Cited: 14
Authors
Lu, Zhiwu [1 ]
Ip, Horace H. S. [1 ,2 ]
Affiliations
[1] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[2] City Univ Hong Kong, Ctr Innovat Applicat Internet & Multimedia Techno, AIMtech Ctr, Kowloon, Hong Kong, Peoples R China
Source
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS | 2009, Vol. 39, No. 4
Keywords
Clustering analysis; competitive learning; Gaussian mixture; model selection; regularization theory; SEGMENTATION; ALGORITHM; LENGTH;
DOI
10.1109/TSMCB.2008.2012119
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
When fitting Gaussian mixtures to multivariate data, it is crucial to select the appropriate number of Gaussians, which is generally referred to as the model selection problem. Under regularization theory, we aim to solve this model selection problem by developing entropy-regularized likelihood (ERL) learning on Gaussian mixtures, and we further present a gradient algorithm for ERL learning. Theoretical analysis reveals a generalized competitive learning mechanism inherent in ERL learning, which leads to automatic model selection on Gaussian mixtures and also makes our ERL learning algorithm less sensitive to initialization than the standard expectation-maximization (EM) algorithm. Experiments on simulated data verify this theoretical analysis. Moreover, our ERL learning algorithm outperforms other competitive learning algorithms in the application of unsupervised image segmentation.
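The competitive, model-selecting behavior described in the abstract can be sketched roughly in code. This is an illustrative reading only, not the authors' exact gradient algorithm: here the entropy regularizer's winner-take-more effect is approximated by sharpening the posterior responsibilities with an exponent 1 + gamma, and components whose mixing weight collapses below a threshold are pruned, mimicking automatic model selection. The function name `erl_gmm_1d`, the sharpening rule, and all parameter values are assumptions for illustration.

```python
import numpy as np

def erl_gmm_1d(x, k_init=5, gamma=0.4, n_iter=200, prune_tol=1e-3, seed=0):
    """Illustrative sketch of competitive GMM fitting in 1-D (not the
    paper's algorithm): EM-style updates with sharpened responsibilities
    and pruning of annihilated components."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Initialize k_init components on randomly chosen data points.
    mu = rng.choice(x, size=k_init, replace=False).astype(float)
    var = np.full(k_init, np.var(x))
    w = np.full(k_init, 1.0 / k_init)
    for _ in range(n_iter):
        # Weighted Gaussian densities, shape (n, k).
        dens = (w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2.0 * np.pi * var))
        # Sharpened responsibilities: exponent > 1 stands in for the
        # entropy regularizer's winner-take-more competition.
        r = dens ** (1.0 + gamma)
        r /= r.sum(axis=1, keepdims=True) + 1e-300
        # M-step-like updates.
        nk = r.sum(axis=0) + 1e-12
        w = nk / n
        w /= w.sum()
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        # Prune components driven toward zero weight: this is the
        # "automatic model selection" effect in miniature.
        keep = w > prune_tol
        mu, var, w = mu[keep], var[keep], w[keep]
        w /= w.sum()
    return w, mu, var
```

On well-separated clusters, the sharpened competition tends to starve redundant components of responsibility mass until they are pruned, so the surviving number of components approximates the number of clusters without a separate model-selection loop.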
Pages: 901-909
Page count: 9
Related Papers
50 records
  • [21] Learning Gaussian mixture models by structural risk minimization
    Wang, LW
    Feng, JF
    Proceedings of 2005 International Conference on Machine Learning and Cybernetics, Vols 1-9, 2005, : 4858 - 4863
  • [22] On the Role of Channel Capacity in Learning Gaussian Mixture Models
    Romanov, Elad
    Bendory, Tamir
    Ordentlich, Or
    CONFERENCE ON LEARNING THEORY, VOL 178, 2022, 178
  • [23] Sliced Wasserstein Distance for Learning Gaussian Mixture Models
    Kolouri, Soheil
    Rohde, Gustavo K.
    Hoffmann, Heiko
    2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 3427 - 3436
  • [24] Multivariate Regression with Incremental Learning of Gaussian Mixture Models
    Acevedo-Valle, Juan M.
    Trejo, Karla
    Angulo, Cecilio
    RECENT ADVANCES IN ARTIFICIAL INTELLIGENCE RESEARCH AND DEVELOPMENT, 2017, 300 : 196 - 205
  • [25] Unsupervised learning of finite full covariance multivariate generalized Gaussian mixture models for human activity recognition
    Najar, Fatma
    Bourouis, Sami
    Bouguila, Nizar
    Belghith, Safya
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (13) : 18669 - 18691
  • [27] Variational Inference of Infinite Generalized Gaussian Mixture Models with Feature Selection
    Amudala, Srikanth
    Ali, Samr
    Bouguila, Nizar
    2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020, : 120 - 127
  • [28] Supervised learning of Gaussian mixture models for visual vocabulary generation
    Fernando, Basura
    Fromont, Elisa
    Muselet, Damien
    Sebban, Marc
    PATTERN RECOGNITION, 2012, 45 (02) : 897 - 907
  • [29] Learning Diagonal Gaussian Mixture Models and Incomplete Tensor Decompositions
    Guo, Bingni
    Nie, Jiawang
    Yang, Zi
    VIETNAM JOURNAL OF MATHEMATICS, 2022, 50 : 421 - 446
  • [30] A fast globally supervised learning algorithm for Gaussian Mixture Models
    Ma, JY
    Gao, W
    WEB-AGE INFORMATION MANAGEMENT, PROCEEDINGS, 2000, 1846 : 449 - 454