Fast subspace clustering by learning projective block diagonal representation

Cited by: 23
Authors
Xu, Yesong [1 ,2 ,3 ]
Chen, Shuo [2 ,3 ,4 ]
Li, Jun [2 ,3 ]
Xu, Chunyan [2 ,3 ]
Yang, Jian [2 ,3 ]
Affiliations
[1] Anhui Polytech Univ, Sch Comp & Informat, Wuhu 241000, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, PCA Lab, Key Lab Intelligent Percept & Syst High Dimens Inf, Nanjing 210094, Jiangsu, Peoples R China
[3] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Jiangsu Key Lab Image & Video Understanding Social, Nanjing 210094, Jiangsu, Peoples R China
[4] RIKEN, Ctr Adv Intelligence Project, Tokyo 1030027, Japan
Funding
China Postdoctoral Science Foundation; U.S. National Science Foundation;
Keywords
Subspace clustering; Block diagonal representation; Large-scale data; Segmentation; Robust
DOI
10.1016/j.patcog.2022.109152
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Block Diagonal Representation (BDR) has attracted massive attention in subspace clustering, yet its high computational cost limits widespread application. To address this issue, we propose a novel approach called Projective Block Diagonal Representation (PBDR), which rapidly pursues a representation matrix with a block diagonal structure. First, an effective sampling strategy is used to select a small subset of the original large-scale data. Then, we learn a projection mapping to match the block diagonal representation matrix on the selected subset. After training, the learned projection mapping is employed to quickly generate a representation matrix with an ideal block diagonal structure for the original large-scale data. Additionally, we extend the proposed PBDR model (i.e., PBDR_ℓ1 and PBDR_*) by capturing the global or local structure of the data to enhance its block diagonal coding capability. This paper also proves the effectiveness of the proposed model theoretically. Notably, this is the first work to directly learn a representation matrix with a block diagonal structure for the large-scale subspace clustering problem. Finally, experimental results on publicly available datasets show that our approaches achieve faster and more accurate clustering than state-of-the-art block diagonal-based subspace clustering approaches, which demonstrates their practical usefulness. (c) 2022 Elsevier Ltd. All rights reserved.
Pages: 13
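
The abstract describes a three-step pipeline: sample a small subset of the data, learn a projection mapping whose induced representation matches a block diagonal code on that subset, and then apply the learned mapping to produce a block-diagonal-structured representation for the full data set before spectral clustering. The sketch below illustrates that flow only in spirit and under explicit assumptions: the helper names are hypothetical, the subset code is a plain ridge-regularized self-expressive solution rather than the paper's BDR solver, and the projection is fit by ordinary least squares instead of the PBDR objective or its ℓ1/nuclear-norm extensions.

# Hypothetical sketch of the pipeline outlined in the abstract (not the paper's
# actual PBDR formulation). Assumptions: the subset code B is a ridge-regularized
# self-expressive solution, and the mapping P is fit so that Xs^T P Xs ~= B.
import numpy as np
from sklearn.cluster import SpectralClustering

def sample_subset(X, m, rng):
    """Step 1: select a small subset of columns (data points) from X (d x n)."""
    idx = rng.choice(X.shape[1], size=m, replace=False)
    return X[:, idx]

def subset_representation(Xs, reg=1e-2):
    """Stand-in for the block diagonal coding step on the subset.
    Here: a ridge-regularized self-expressive code, NOT the paper's BDR solver."""
    G = Xs.T @ Xs
    B = np.linalg.solve(G + reg * np.eye(G.shape[0]), G)
    np.fill_diagonal(B, 0.0)   # no self-representation
    return B

def learn_projection(Xs, B, reg=1e-2):
    """Step 2: fit P so that Xs^T P Xs approximates B.
    Ridge-regularized closed form of min_P ||Xs^T P Xs - B||_F^2."""
    d = Xs.shape[0]
    G = Xs @ Xs.T + reg * np.eye(d)
    return np.linalg.solve(G, Xs @ B @ Xs.T) @ np.linalg.inv(G)

def cluster_full_data(X, P, n_clusters):
    """Step 3: apply the learned mapping to all data, then spectral clustering."""
    Z = X.T @ P @ X                      # representation for the full data set
    W = 0.5 * (np.abs(Z) + np.abs(Z.T))  # symmetric non-negative affinity matrix
    return SpectralClustering(n_clusters=n_clusters,
                              affinity="precomputed",
                              assign_labels="discretize",
                              random_state=0).fit_predict(W)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two 3-dimensional subspaces embedded in R^20, 200 points each.
    bases = [np.linalg.qr(rng.standard_normal((20, 3)))[0] for _ in range(2)]
    X = np.hstack([U @ rng.standard_normal((3, 200)) for U in bases])
    Xs = sample_subset(X, m=80, rng=rng)
    B = subset_representation(Xs)
    P = learn_projection(Xs, B)
    print(cluster_full_data(X, P, n_clusters=2)[:10])

Even in this toy version, the computational point the abstract emphasizes survives: the expensive coding step runs only on the m sampled points, while the representation for all n points is obtained by a single projection X^T P X.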