A hierarchical clustering based on mutual information maximization

Cited: 0
Authors
Aghagolzadeh, M. [1 ]
Soltanian-Zadeh, H. [1 ,3 ]
Araabi, B. [1 ]
Aghagolzadeh, A. [2 ]
Affiliations
[1] Univ Tehran, Control & Intelligent Proc Ctr Excellence, Dept Elect & Comp Engn, Tehran 14395515, Iran
[2] Univ Tabriz, Fac Elect & Comp Engn, Tabriz 51664, Iran
[3] Henry Ford Hlth Syst, Radiol Image Analyt Lab, Detroit, MI 48202 USA
Source
2007 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOLS 1-7 | 2007
Keywords
agglomerative hierarchical clustering; information potential; mutual information (MI); Renyi's entropy;
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Classification Codes
0808 ; 0809 ;
Abstract
Mutual information has been used in many clustering algorithms to measure general dependencies between random data variables, but the difficulty of estimating it from small datasets has limited its usefulness for clustering in many applications. A novel clustering method is proposed that estimates mutual information from the information potential computed pairwise between data points, without any prior assumption about the cluster density function. The proposed algorithm increases the mutual information at each step of an agglomerative hierarchical scheme. We show experimentally that maximizing the mutual information between data points and their class labels leads to efficient clustering. Experiments on a variety of artificial and real datasets demonstrate the superiority of this algorithm, as well as its low computational complexity, compared with other information-based clustering methods and several standard clustering algorithms.
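To make the procedure concrete, the Python sketch below shows one way such an algorithm can be organized: a Gaussian-kernel "information potential" matrix is computed pairwise over the data, a Cauchy-Schwarz-style quadratic mutual information between the points and their cluster labels is built from those potentials, and singleton clusters are merged greedily so that each merge maximizes this score. The kernel width SIGMA, the specific Cauchy-Schwarz surrogate, and the brute-force merge loop are illustrative assumptions, not the authors' exact formulation.

import numpy as np

SIGMA = 0.5  # illustrative kernel width (an assumption, not from the paper)

def kernel_matrix(X, sigma=SIGMA):
    # Pairwise Gaussian kernel values; their sums are the "information
    # potentials" used in Renyi's quadratic entropy estimators.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def cs_quadratic_mi(K, labels):
    # Cauchy-Schwarz quadratic MI surrogate between the data points and their
    # cluster labels, computed from pairwise potentials (Parzen-style, with no
    # parametric assumption about the cluster densities).
    n = K.shape[0]
    v_all = K.sum() / n ** 2              # potential of the whole dataset
    v_joint = 0.0                         # within-cluster potentials (joint term)
    v_cross = 0.0                         # cluster-vs-dataset cross potentials
    sum_pc2 = 0.0                         # sum of squared cluster priors
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        p_c = len(idx) / n
        sum_pc2 += p_c ** 2
        v_joint += K[np.ix_(idx, idx)].sum() / n ** 2
        v_cross += p_c * K[idx].sum() / n ** 2
    # Cauchy-Schwarz divergence between p(x, c) and p(x)p(c); larger means the
    # labels carry more information about the points.
    return np.log(v_joint * sum_pc2 * v_all + 1e-300) - 2.0 * np.log(v_cross + 1e-300)

def agglomerate(X, n_clusters):
    # Start from singleton clusters and greedily merge the pair of clusters
    # whose union gives the largest MI score, until n_clusters remain.
    K = kernel_matrix(X)
    labels = np.arange(len(X))
    while len(np.unique(labels)) > n_clusters:
        cids = np.unique(labels)
        best_score, best_pair = -np.inf, None
        for a in range(len(cids)):
            for b in range(a + 1, len(cids)):
                trial = labels.copy()
                trial[trial == cids[b]] = cids[a]
                score = cs_quadratic_mi(K, trial)
                if score > best_score:
                    best_score, best_pair = score, (cids[a], cids[b])
        labels[labels == best_pair[1]] = best_pair[0]
    return labels

# Example: two well-separated Gaussian blobs should be recovered as two clusters.
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5.0])
print(agglomerate(X, n_clusters=2))

Each agglomerative level in this sketch re-evaluates the score for every candidate pair from scratch, which costs O(c^2 * n^2); the abstract's claim of low computational complexity suggests the pairwise potentials would instead be updated incrementally as clusters merge.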
Pages: 277 / +
Number of pages: 2
Related Papers (50 total)
  • [1] Deep node clustering based on mutual information maximization
    Molaei, Soheila
    Bousejin, Nima Ghanbari
    Zare, Hadi
    Jalili, Mahdi
    NEUROCOMPUTING, 2021, 455 : 274 - 282
  • [2] Information-Maximization Clustering Based on Squared-Loss Mutual Information
    Sugiyama, Masashi
    Niu, Gang
    Yamada, Makoto
    Kimura, Manabu
    Hachiya, Hirotaka
    NEURAL COMPUTATION, 2014, 26 (01) : 84 - 131
  • [3] Hierarchical clustering using mutual information
    Kraskov, A
    Stögbauer, H
    Andrzejak, RG
    Grassberger, P
    EUROPHYSICS LETTERS, 2005, 70 (02) : 278 - 284
  • [4] Agglomerative hierarchical clustering of continuous variables based on mutual information
    Kojadinovic, I
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2004, 46 (02) : 269 - 294
  • [5] Learning Deep Generative Clustering via Mutual Information Maximization
    Yang, Xiaojiang
    Yan, Junchi
    Cheng, Yu
    Zhang, Yizhe
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (09) : 6263 - 6275
  • [6] Variational Deep Embedding Clustering by Augmented Mutual Information Maximization
    Ji, Qiang
    Sun, Yanfeng
    Hu, Yongli
    Yin, Baocai
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 2196 - 2202
  • [7] Word sense induction with agglomerative clustering and mutual information maximization
    Abdine, Hadi
    Eddine, Moussa Kamal
    Buscaldi, Davide
    Vazirgiannis, Michalis
    AI OPEN, 2023, 4 : 193 - 201
  • [8] Signal estimation based on mutual information maximization
    Rohde, G. K.
    Nichols, J.
    Bucholtz, F.
    Michalowicz, J. V.
    CONFERENCE RECORD OF THE FORTY-FIRST ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, VOLS 1-5, 2007, : 597 - +
  • [9] Deep graph clustering via mutual information maximization and mixture model
    Ahmadi, Maedeh
    Safayani, Mehran
    Mirzaei, Abdolreza
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (08) : 4549 - 4572
  • [10] Dependence-Maximization Clustering with Least-Squares Mutual Information
    Kimura, Manabu
    Sugiyama, Masashi
    JOURNAL OF ADVANCED COMPUTATIONAL INTELLIGENCE AND INTELLIGENT INFORMATICS, 2011, 15 (07) : 800 - 805