Robust Least Squares Regression for Subspace Clustering: A Multi-View Clustering Perspective

Cited by: 7
Authors
Du, Yangfan [1 ]
Lu, Gui-Fu [1 ]
Ji, Guangyan [1 ]
Affiliations
[1] Anhui Polytech Univ, Sch Comp Sci & Informat, Wuhu 241000, Anhui, Peoples R China
Keywords
Affinity matrix; least squares regression; subspace clustering; tensor; algorithm
DOI
10.1109/TIP.2023.3327564
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recently, with the assumption that samples can be reconstructed from the samples themselves, subspace clustering (SC) methods have achieved great success. Generally, SC methods contain some parameters to be tuned, and different affinity matrices can be obtained with different parameter values. In this paper, for the first time, we study how to fuse these different affinity matrices to improve clustering performance and provide the corresponding solution from a multi-view clustering (MVC) perspective. That is, we argue that the different affinity matrices are consistent and complementary, which is similar to the fundamental assumption of MVC methods. Based on this observation, we use least squares regression (LSR), a typical SC method, as an example, since it can be efficiently optimized and has shown good clustering performance, and we propose a novel robust least squares regression method from an MVC perspective (RLSR/MVCP). Specifically, we first apply LSR with different parameter values to obtain different affinity matrices. Then, to fully explore the information contained in these affinity matrices and to remove noise, we fuse them into a tensor that is constrained by a tensor low-rank constraint, i.e., the tensor nuclear norm (TNN). The two steps are combined into a single framework, which is solved by the augmented Lagrange multiplier (ALM) method. Experimental results on several datasets indicate that RLSR/MVCP achieves very encouraging clustering performance and is superior to state-of-the-art SC methods.
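The first step of the method described above is easy to illustrate because standard LSR has the closed-form solution Z = (X^T X + lambda*I)^(-1) X^T X. The following is a minimal Python sketch, not the authors' implementation: the function names and the example parameter grid are assumptions for illustration, and the TNN-constrained fusion solved by ALM is omitted.

# Minimal sketch (illustrative, not the paper's code): build LSR affinity
# matrices for several lambda values and stack them into an n x n x V tensor.
import numpy as np

def lsr_affinity(X, lam):
    # Closed-form LSR: Z = (X^T X + lam*I)^(-1) X^T X, with samples as columns of X.
    n = X.shape[1]
    G = X.T @ X
    Z = np.linalg.solve(G + lam * np.eye(n), G)
    # Symmetric, nonnegative affinity matrix built from the representation Z.
    return 0.5 * (np.abs(Z) + np.abs(Z.T))

def affinity_tensor(X, lams=(0.01, 0.1, 1.0, 10.0)):  # hypothetical parameter grid
    # One affinity matrix per lambda, stacked along the third mode.
    return np.stack([lsr_affinity(X, lam) for lam in lams], axis=2)

if __name__ == "__main__":
    X = np.random.randn(50, 100)   # 100 random 50-dimensional samples
    T = affinity_tensor(X)
    print(T.shape)                 # (100, 100, 4)

In RLSR/MVCP, such a tensor of affinity matrices would then be fused under the TNN low-rank constraint rather than used directly.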
Pages: 216-227
Page count: 12