Adaptive multi-granularity sparse subspace clustering

Cited by: 12
|
Authors
Deng, Tingquan [1 ]
Yang, Ge [1 ]
Huang, Yang [1 ]
Yang, Ming [1 ]
Fujita, Hamido [2 ,3 ,4 ]
Affiliations
[1] Harbin Engn Univ, Coll Math Sci, Harbin 150001, Peoples R China
[2] Univ Teknol Malaysia, Malaysia Japan Int Inst Technol MJIIT, Kuala Lumpur 54100, Malaysia
[3] Univ Granada, Andalusian Res Inst Data Sci & Computat Intelligen, Granada, Spain
[4] Iwate Prefectural Univ, Reg Res Ctr, Takizawa 0200693, Japan
Funding
National Natural Science Foundation of China
Keywords
Sparse subspace clustering; Sparse representation; Scored nearest neighborhood; Granular computing; Multi-granularity; LOW-RANK REPRESENTATION; DIMENSIONALITY REDUCTION; ROBUST; MATRIX; MODELS; SEGMENTATION; ALGORITHM;
DOI
10.1016/j.ins.2023.119143
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Sparse subspace clustering (SSC) reveals the distribution of data from an algebraic perspective and has been widely applied to high-dimensional data. The key to SSC is to learn the sparsest self-representation of the data and derive an adjacency graph from it. Theoretically, an adjacency matrix with a proper block diagonal structure leads to the desired clustering result. Various generalizations impose Laplacian regularization or locally linear embedding to describe the manifold structure based on the nearest neighborhoods of samples. However, a single set of nearest neighborhoods cannot effectively characterize local information. From the perspective of granular computing, the notion of scored nearest neighborhoods is introduced to build multi-granularity neighborhoods of samples. This multi-granularity representation is integrated with SSC to collaboratively learn the sparse representation, yielding an adaptive multi-granularity sparse subspace clustering model (AMGSSC). The learned adjacency matrix has a consistent block diagonal structure at all granularity levels. Furthermore, the locally linear relationship between samples is embedded in AMGSSC, and an enhanced model, AMGLSSC, is developed to eliminate the over-sparsity of the learned adjacency graph. Experimental results show that both models outperform state-of-the-art subspace clustering methods on several clustering criteria.
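To make the standard SSC pipeline summarized above concrete, the following minimal Python sketch learns a sparse self-representation, forms a symmetric adjacency graph, and applies spectral clustering. It illustrates classical SSC only, not the AMGSSC/AMGLSSC models of the paper; the lasso-based solver and the hyperparameters `alpha` and `n_clusters` are illustrative assumptions.

```python
# Minimal sketch of a classical SSC pipeline (not the paper's AMGSSC model):
# 1) represent each sample sparsely by the other samples,
# 2) build an adjacency graph from the coefficients,
# 3) run spectral clustering on that graph.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def sparse_subspace_clustering(X, n_clusters, alpha=0.01):
    """X: (n_samples, n_features) data matrix; returns cluster labels."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        # Sparse coefficients for sample i over all other samples;
        # excluding x_i itself enforces a zero diagonal in C.
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
        lasso.fit(X[others].T, X[i])
        C[i, others] = lasso.coef_
    # Symmetric affinity from the sparse coefficients: W = |C| + |C|^T.
    W = np.abs(C) + np.abs(C).T
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed",
                                assign_labels="discretize").fit_predict(W)
    return labels
```

In this sketch, an adjacency matrix whose large entries concentrate within clusters (an approximately block diagonal W) is what allows spectral clustering to recover the subspaces, which is the property the proposed models aim to preserve across granularity levels.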
Pages: 26