Stochastic Sparse Subspace Clustering

Cited by: 63
Authors
Chen, Ying [1]
Li, Chun-Guang [1]
You, Chong [2]
Affiliations
[1] Beijing Univ Posts & Telecommun, SICE, Beijing, Peoples R China
[2] Univ Calif Berkeley, EECS, Berkeley, CA 94720 USA
Source
2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2020
Funding
National Natural Science Foundation of China
Keywords
SEGMENTATION; ROBUST
DOI
10.1109/CVPR42600.2020.00421
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
State-of-the-art subspace clustering methods are based on the self-expressive model, which represents each data point as a linear combination of other data points. By enforcing such representations to be sparse, sparse subspace clustering is guaranteed to produce a subspace-preserving data affinity in which two points are connected only if they lie in the same subspace. However, data points from the same subspace may not be well connected, leading to the issue of over-segmentation. To address this issue, we introduce dropout, which randomly drops out data points in the self-expressive model. In particular, we show that dropout is equivalent to adding a squared ℓ2-norm regularization on the representation coefficients, and therefore induces denser solutions. We then reformulate the optimization problem as a consensus problem over a set of small-scale subproblems. This leads to a scalable and flexible sparse subspace clustering approach, termed Stochastic Sparse Subspace Clustering, which can effectively handle large-scale datasets. Extensive experiments on synthetic data and real-world datasets validate the efficiency and effectiveness of our proposal.
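The dropout-to-ℓ2 equivalence stated in the abstract can be checked numerically: under inverted dropout with rate p (each coefficient's column kept with probability 1−p and rescaled by 1/(1−p)), the expected self-expressive residual equals the undropped residual plus the weighted squared ℓ2 penalty p/(1−p)·Σᵢ cᵢ²‖xᵢ‖². A minimal sketch of this check, with illustrative variable names and a Monte Carlo setup that are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 50
X = rng.standard_normal((d, n))   # columns x_i play the role of data points
c = rng.standard_normal(n)        # a representation coefficient vector
x = X @ c                         # the point being self-expressed
p = 0.3                           # dropout rate

# Monte Carlo estimate of E ||x - X (m * c)||^2 under inverted dropout,
# where m_i = Bernoulli(1 - p) / (1 - p), so that E[m_i] = 1.
trials = 100_000
M = (rng.random((trials, n)) > p) / (1 - p)   # trials x n dropout masks
R = x[:, None] - X @ (M * c).T                # d x trials residuals
mc = (R ** 2).sum(axis=0).mean()

# Closed form: ||x - X c||^2 + p/(1-p) * sum_i c_i^2 ||x_i||^2
closed = ((x - X @ c) ** 2).sum() \
    + p / (1 - p) * (c ** 2 * (X ** 2).sum(axis=0)).sum()
print(mc, closed)   # the two agree up to Monte Carlo error
```

The extra term is exactly a squared ℓ2 penalty on c weighted by the column norms of X, which is why dropout densifies the sparse solutions and improves connectivity.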
Pages: 4154-4163
Page count: 10