Contrastive self-representation learning for data clustering

Cited by: 8
Authors
Zhao, Wenhui [1 ]
Gao, Quanxue [1 ]
Mei, Shikun [1 ]
Yang, Ming [2 ]
Affiliations
[1] Xidian Univ, Sch Telecommun Engn, Xian 710071, Shaanxi, Peoples R China
[2] Harbin Engn Univ, Coll Math Sci, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Self-representation; Contrastive learning; Subspace clustering; PRINCIPAL COMPONENT ANALYSIS; LOW-RANK; SPARSE; ROBUST; GRAPH;
DOI
10.1016/j.neunet.2023.08.050
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper is concerned with self-representation subspace learning, one of the most representative subspace techniques, which has attracted considerable attention for clustering due to its good performance. Among such methods, low-rank representation (LRR) has achieved impressive results for subspace clustering. However, it only considers the similarity between samples themselves, while neglecting their differences from other samples. In addition, it cannot handle noise well or adequately portray cluster-to-cluster relationships. To address these problems, we propose a Contrastive Self-representation model for Clustering (CSC). When learning the self-representation coefficient matrix of the data, CSC simultaneously takes into account the similarity between positive pairs and the dissimilarity between negative pairs, and the form of its loss function reduces the effect of noise on the results. Moreover, we impose an l1,2-norm regularizer on the coefficient matrix to enforce sparsity and better characterize the cluster structure. The learned self-representation coefficient matrix thus encodes both discriminative information and cluster structure. Extensive experiments on seven benchmark databases demonstrate the superiority of the proposed method. (c) 2023 Elsevier Ltd. All rights reserved.
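Note (illustrative sketch, not from the paper): to make the self-representation idea in the abstract concrete, the snippet below implements a generic self-expressive clustering pipeline in NumPy/scikit-learn. It substitutes a simple Frobenius-regularized least-squares coefficient matrix for CSC's contrastive loss and l1,2-norm regularizer; every function name and parameter value here is an assumption for illustration, not the authors' code.

# Minimal sketch of generic self-representation (self-expressive) clustering.
# NOT the CSC model: a plain least-squares regularizer stands in for the
# contrastive loss and l1,2-norm sparsity term described in the abstract.
import numpy as np
from sklearn.cluster import SpectralClustering

def self_representation_clustering(X, n_clusters, lam=0.1):
    """X: (d, n) data matrix, one sample per column; returns cluster labels."""
    n = X.shape[1]
    G = X.T @ X
    # Closed-form Z = argmin_Z ||X - X Z||_F^2 + lam * ||Z||_F^2
    Z = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(Z, 0.0)                 # heuristic: a sample should not represent itself
    W = 0.5 * (np.abs(Z) + np.abs(Z.T))      # symmetric affinity from coefficients
    return SpectralClustering(n_clusters=n_clusters, affinity='precomputed',
                              random_state=0).fit_predict(W)

# Toy usage: two random 2-dimensional subspaces in a 20-dimensional space.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(20, 2)) @ rng.normal(size=(2, 30)),
               rng.normal(size=(20, 2)) @ rng.normal(size=(2, 30))])
print(self_representation_clustering(X, n_clusters=2))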
Pages: 648-655
Number of pages: 8