K-BEST subspace clustering: kernel-friendly block-diagonal embedded and similarity-preserving transformed subspace clustering

Cited: 0
Authors
Maggu, Jyoti [1 ]
Goel, Anurag [2 ]
Affiliations
[1] Thapar Inst Engn & Technol, CSED, Patiala 147004, Punjab, India
[2] Delhi Technol Univ, Dept Comp Sci & Engn, New Delhi 110042, India
Keywords
Block diagonal representation; Kernel representations; Non-linear subspace clustering; Similarity preserving; Transformed subspace clustering; LOW-RANK REPRESENTATION; MANIFOLD;
DOI
10.1007/s10044-024-01336-2
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Subspace clustering methods employing sparse and low-rank models have demonstrated efficacy in clustering high-dimensional data. These approaches typically assume that the input data are separable into distinct subspaces, a premise that does not hold in general. Furthermore, prevalent low-rank and sparse methods relying on self-expression are effective primarily on linearly structured data and face limitations on datasets with intricate nonlinear structure. While kernel subspace clustering methods excel at handling nonlinear structures, they may lose similarity information when reconstructing the original data in kernel space, and they may fail to attain an affinity matrix with the desired block-diagonal property. In response to these challenges, this paper introduces a novel subspace clustering approach named Similarity Preserving Kernel Block Diagonal Representation based Transformed Subspace Clustering (KBD-TSC). KBD-TSC contributes in three key aspects: (1) it integrates a kernelized version of transform learning within a subspace clustering framework, introducing a block-diagonal representation term that yields an affinity matrix with a block-diagonal structure; (2) it constructs a similarity-preserving regularizer that minimizes the discrepancy between the inner products of the original data and those of the reconstructed data in kernel space, enhancing the preservation of similarity information between the original data points; and (3) it integrates the block-diagonal representation term and the similarity-preserving regularizer into a kernel self-expression model. The optimization of the proposed model is addressed efficiently through the alternating direction method of multipliers.
This study validates the effectiveness of the proposed KBD-TSC method through experimental results obtained from nine datasets, showcasing its potential in addressing the limitations of existing subspace clustering techniques.
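The two regularizers summarized in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: the eigenvalue-based block-diagonal penalty follows the common block-diagonal-representation formulation (sum of the k smallest Laplacian eigenvalues of the affinity matrix, which vanishes when the affinity has at least k connected components), and the similarity-preserving term compares the Gram (inner-product) matrices of the original and reconstructed data. All function names here are assumptions for illustration.

```python
import numpy as np

def block_diag_regularizer(C, k):
    """Sum of the k smallest eigenvalues of the graph Laplacian built
    from the (symmetrized, non-negative) affinity matrix C.
    Zero iff the affinity graph has at least k connected components,
    i.e. C is (permuted) block-diagonal with >= k blocks."""
    W = 0.5 * (np.abs(C) + np.abs(C).T)        # symmetric non-negative affinity
    L = np.diag(W.sum(axis=1)) - W             # unnormalized graph Laplacian
    eigvals = np.linalg.eigvalsh(L)            # ascending eigenvalues
    return eigvals[:k].sum()

def similarity_preserving_loss(X, X_rec):
    """Discrepancy between inner products of the original data X and
    the reconstructed data X_rec (columns are samples): the squared
    Frobenius distance between their Gram matrices."""
    G = X.T @ X
    G_rec = X_rec.T @ X_rec
    return np.linalg.norm(G - G_rec, 'fro') ** 2
```

For example, an affinity with two clean blocks (`np.kron(np.eye(2), np.ones((3, 3)))`) gives a block-diagonal penalty of (numerically) zero for k = 2, while a fully connected affinity does not; the similarity loss is zero exactly when the reconstruction preserves all pairwise inner products.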
Pages: 13