One-step Low-Rank Representation for Clustering

Cited by: 5
Authors
Fu, Zhiqiang [1 ,2 ]
Zhao, Yao [1 ,2 ]
Chang, Dongxia [1 ,2 ]
Wang, Yiming [1 ]
Wen, Jie [3 ]
Zhang, Xingxing [4 ]
Guo, Guodong [5 ]
Affiliations
[1] Beijing Jiaotong Univ, Inst Informat Sci, Beijing, Peoples R China
[2] Beijing Key Lab Adv Informat Sci & Network Techno, Beijing, Peoples R China
[3] Harbin Inst Technol, Shenzhen Key Lab Visual Object Detect & Recognit, Shenzhen, Peoples R China
[4] Tsinghua Univ, Dept Comp Sci & Technol, Beijing, Peoples R China
[5] Baidu Res, Beijing, Peoples R China
Funding
National Key Research and Development Program of China; China Postdoctoral Science Foundation;
Keywords
Low-rank representation; data clustering; affinity matrix; subspace learning; NONNEGATIVE LOW-RANK; GRAPH; SPARSE;
DOI
10.1145/3503161.3548293
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Subject Classification Code
081203; 0835;
Abstract
Existing low-rank representation-based methods adopt a two-step framework, which must employ an extra clustering method to obtain labels after representation learning. In this paper, a novel one-step representation-based method, i.e., One-step Low-Rank Representation (OLRR), is proposed to capture multi-subspace structures for clustering. OLRR integrates the low-rank representation model and clustering into a unified framework, so it can jointly learn the low-rank subspace structure embedded in the data and obtain the clustering results. In particular, by approximating the representation matrix with the product of two identical clustering indicator matrices, OLRR can directly show the probability of each sample belonging to each cluster. Further, a probability penalty is introduced to ensure that samples with smaller distances are more inclined to fall into the same cluster, which enhances the discrimination of the clustering indicator matrix and leads to more favorable clustering performance. Moreover, to enhance robustness against noise, OLRR uses the probabilities to guide denoising and then performs representation learning and clustering in the recovered clean space. Extensive experiments demonstrate the robustness and effectiveness of OLRR. Our code is publicly available at: https://github.com/fuzhiqiang1230/OLRR.
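To make the description in the abstract concrete, the following is a minimal illustrative sketch of a one-step low-rank objective in this spirit, assuming the standard LRR self-expression model X = XZ + E with data X in R^{d x n}, and writing the clustering indicator (probability) matrix as F in R^{n x c} with rows f_i; the trade-off weights lambda and beta and the distance-weighted penalty term are assumptions for illustration only, not the exact OLRR objective, which is given in the paper.

% Illustrative sketch only: a one-step low-rank objective of this kind,
% not the exact OLRR formulation from the paper.
\begin{aligned}
\min_{F,\,E}\quad & \|F F^{\top}\|_{*} \;+\; \lambda\,\|E\|_{2,1}
  \;+\; \beta \sum_{i,j=1}^{n} \|x_i - x_j\|_2^{2}\,\langle f_i, f_j \rangle \\
\text{s.t.}\quad & X = X F F^{\top} + E, \qquad F \ge 0, \qquad F \mathbf{1}_c = \mathbf{1}_n .
\end{aligned}

In such a sketch, Z = FF^T plays the role of the low-rank representation, and each row of F sums to one and directly gives the probability of that sample belonging to each of the c clusters, so labels can be read off as the arg max of each row without a separate spectral or k-means step. The distance-weighted term discourages assigning far-apart samples to the same cluster, which is one common way to realize the probability penalty described above; the denoising in the recovered clean space mentioned in the abstract is only hinted at here through E.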
Pages: 2220 - 2228
Number of pages: 9