Adaptive multi-granularity sparse subspace clustering

Cited by: 15
Authors
Deng, Tingquan [1 ]
Yang, Ge [1 ]
Huang, Yang [1 ]
Yang, Ming [1 ]
Fujita, Hamido [2 ,3 ,4 ]
Affiliations
[1] Harbin Engn Univ, Coll Math Sci, Harbin 150001, Peoples R China
[2] Univ Teknol Malaysia, Malaysia Japan Int Inst Technol MJIIT, Kuala Lumpur 54100, Malaysia
[3] Univ Granada, Andalusian Res Inst Data Sci & Computat Intelligen, Granada, Spain
[4] Iwate Prefectural Univ, Reg Res Ctr, Takizawa 0200693, Japan
Funding
National Natural Science Foundation of China;
Keywords
Sparse subspace clustering; Sparse representation; Scored nearest neighborhood; Granular computing; Multi-granularity; Low-rank representation; Dimensionality reduction; Robust; Matrix; Models; Segmentation; Algorithm;
DOI
10.1016/j.ins.2023.119143
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Sparse subspace clustering (SSC) reveals the distribution of data from an algebraic perspective and has been widely applied to high-dimensional data. The key to SSC is to learn the sparsest representation of the samples and derive an adjacency graph from it. Theoretically, an adjacency matrix with a proper block-diagonal structure leads to the desired clustering result. Various generalizations impose Laplacian regularization or locally linear embedding to describe the manifold structure based on the nearest neighborhoods of samples. However, a single set of nearest neighborhoods cannot effectively characterize local information. From the perspective of granular computing, the notion of scored nearest neighborhoods is introduced to develop multi-granularity neighborhoods of samples. This multi-granularity representation of samples is integrated with SSC to collaboratively learn the sparse representation, and an adaptive multi-granularity sparse subspace clustering model (AMGSSC) is proposed. The learned adjacency matrix has a consistent block-diagonal structure at all granularity levels. Furthermore, the locally linear relationship between samples is embedded in AMGSSC, and an enhanced model, AMGLSSC, is developed to eliminate the over-sparsity of the learned adjacency graph. Experimental results show that both models outperform state-of-the-art subspace clustering methods on several clustering criteria.
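The SSC pipeline the abstract builds on can be sketched as follows: each sample is expressed as a sparse combination of the other samples, and the magnitudes of the learned coefficients form the adjacency graph handed to spectral clustering. This is a minimal sketch of plain SSC, not the paper's AMGSSC model; the ISTA solver, the penalty `lam`, and the toy two-subspace data are illustrative assumptions.

```python
import numpy as np

def sparse_self_representation(X, lam=0.05, n_iter=1000):
    """ISTA for min_C 0.5*||X - XC||_F^2 + lam*||C||_1  s.t.  diag(C) = 0.

    X holds one unit-norm sample per column.
    """
    n = X.shape[1]
    G = X.T @ X                          # Gram matrix of the samples
    step = 1.0 / np.linalg.norm(G, 2)    # 1/L, L = Lipschitz constant of the gradient
    C = np.zeros((n, n))
    for _ in range(n_iter):
        C = C - step * (G @ C - G)                                 # gradient step on the fit term
        C = np.sign(C) * np.maximum(np.abs(C) - step * lam, 0.0)   # soft-thresholding (l1 prox)
        np.fill_diagonal(C, 0.0)         # forbid the trivial self-representation
    return C

def subspace_points(rng, ambient_dim, sub_dim, n):
    """n unit-norm points drawn from a random sub_dim-dimensional subspace."""
    basis, _ = np.linalg.qr(rng.standard_normal((ambient_dim, sub_dim)))
    pts = basis @ rng.standard_normal((sub_dim, n))
    return pts / np.linalg.norm(pts, axis=0)

# Toy data: 20 points from each of two independent 2-D subspaces of R^6.
rng = np.random.default_rng(0)
X = np.hstack([subspace_points(rng, 6, 2, 20), subspace_points(rng, 6, 2, 20)])
C = sparse_self_representation(X)
W = np.abs(C) + np.abs(C).T              # symmetric adjacency graph for spectral clustering
within = W[:20, :20].sum() + W[20:, 20:].sum()
between = W[:20, 20:].sum() + W[20:, :20].sum()
```

For independent subspaces the learned coefficients concentrate inside each subspace, so `W` is close to block-diagonal (`within` far exceeds `between`); the abstract's point is that this block structure is what a good clustering requires, and AMGSSC enforces it consistently across granularity levels.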
Pages: 26