Representation learning for clustering via building consensus

Cited by: 2
Authors
Deshmukh, Aniket Anand [1 ]
Regatti, Jayanth Reddy [2 ]
Manavoglu, Eren [1 ]
Dogan, Urun [1 ]
Affiliations
[1] Microsoft, Mountain View, CA 94043 USA
[2] Ohio State Univ, Columbus, OH 43210 USA
Keywords
Unsupervised learning; Clustering; Representation learning; Consensus clustering;
DOI
10.1007/s10994-022-06194-9
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
In this paper, we focus on unsupervised representation learning for clustering of images. Recent advances in deep clustering and unsupervised representation learning are based on the idea that different views of an input image (generated through data augmentation techniques) must be close in the representation space (exemplar consistency), and/or similar images must have similar cluster assignments (population consistency). We define an additional notion of consistency, consensus consistency, which ensures that representations are learned to induce similar partitions for variations in the representation space, different clustering algorithms or different initializations of a single clustering algorithm. We define a clustering loss by executing variations in the representation space and seamlessly integrate all three consistencies (consensus, exemplar and population) into an end-to-end learning framework. The proposed algorithm, consensus clustering using unsupervised representation learning (ConCURL), improves upon the clustering performance of state-of-the-art methods on four out of five image datasets. Furthermore, we extend the evaluation procedure for clustering to reflect the challenges encountered in real-world clustering tasks, such as maintaining clustering performance in cases with distribution shifts. We also perform a detailed ablation study for a deeper understanding of the proposed algorithm. The code and the trained models are available at https://github.com/JayanthRR/ConCURL_NCE.
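The consensus-consistency idea described in the abstract — learning representations whose induced partitions agree across variations in the representation space — can be sketched roughly as follows. This is a minimal illustration, not the authors' ConCURL implementation: the random linear projections, the distance-softmax `soft_assignments` helper, and the pairwise cross-entropy agreement term are all our assumptions for the sake of a self-contained example.

```python
import numpy as np

def soft_assignments(z, centers, tau=0.5):
    """Soft cluster assignments: softmax over negative squared
    distances from each embedding to each cluster center."""
    d = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)  # (n, k)
    logits = -d / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def consensus_loss(z, centers, n_views=2, proj_dim=8, seed=0):
    """Cluster several randomly projected copies of the embeddings
    ("variations in the representation space") and penalize
    disagreement between the resulting soft partitions."""
    rng = np.random.default_rng(seed)
    partitions = []
    for _ in range(n_views):
        w = rng.normal(size=(z.shape[1], proj_dim)) / np.sqrt(z.shape[1])
        partitions.append(soft_assignments(z @ w, centers @ w))
    total, pairs = 0.0, 0
    for i in range(n_views):
        for j in range(n_views):
            if i == j:
                continue
            # Cross-entropy of partition j against partition i as target:
            # small only when both projected views induce similar partitions.
            total += -(partitions[i] * np.log(partitions[j] + 1e-12)).sum(axis=1).mean()
            pairs += 1
    return total / pairs
```

In an end-to-end setup, a loss of this shape would be minimized jointly with the exemplar- and population-consistency objectives, so that gradients push the encoder toward representations whose clustering is stable under such variations.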
Pages: 4601 - 4638
Page count: 38
Related papers
50 items in total
  • [1] Representation learning for clustering via building consensus
    Aniket Anand Deshmukh
    Jayanth Reddy Regatti
    Eren Manavoglu
    Urun Dogan
    Machine Learning, 2022, 111 : 4601 - 4638
  • [2] Consensus Clustering With Unsupervised Representation Learning
    Regatti, Jayanth Reddy
    Deshmukh, Aniket Anand
    Manavoglu, Eren
    Dogan, Urun
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [3] Learning an enhanced consensus representation for multi-view clustering via latent representation correlation preserving
    Gui, Zhongyan
    Yang, Jing
    Xie, Zhiqiang
    Knowledge-Based Systems, 2022, 253
  • [4] Salient and consensus representation learning based incomplete multiview clustering
    Zhao, Shuping
    Cui, Zhongwei
    Wu, Lian
    Xu, Yong
    Zuo, Yu
    Fei, Lunke
    APPLIED INTELLIGENCE, 2023, 53 (03) : 2723 - 2737
  • [5] Comprehensive consensus representation learning for incomplete multiview subspace clustering
    Guo, Xiaoxing
    Lu, Gui-Fu
    INFORMATION SCIENCES, 2024, 678
  • [6] Trajectory Clustering via Deep Representation Learning
    Yao, Di
    Zhang, Chao
    Zhu, Zhihua
    Huang, Jianhui
    Bi, Jingping
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 3880 - 3887
  • [7] Consensus Affinity Graph Learning via Structure Graph Fusion and Block Diagonal Representation for Multiview Clustering
    Gui, Zhongyan
    Yang, Jing
    Xie, Zhiqiang
    Ye, Cuicui
    NEURAL PROCESSING LETTERS, 2024, 56 (02)
  • [8] Fast and interpretable consensus clustering via minipatch learning
    Gan, Luqin
    Allen, Genevera I.
    PLOS COMPUTATIONAL BIOLOGY, 2022, 18 (10)