Self-paced latent embedding space learning for multi-view clustering

Cited by: 2
Authors
Li, Haoran [1 ]
Ren, Zhenwen [1 ,2 ]
Zhao, Chunyu [1 ]
Xu, Zhi [3 ]
Dai, Jian [4 ]
Affiliations
[1] Southwest Univ Sci & Technol, Sch Informat Engn, Sch Natl Def Sci & Technol, Mianyang 621010, Sichuan, Peoples R China
[2] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing 210008, Peoples R China
[3] Guilin Univ Elect Technol, Guangxi Key Lab Images & Graph Intelligent Proc, Guilin 541004, Peoples R China
[4] China South Ind Grp Corp, Southwest Automat Res Inst, Mianyang 621000, Sichuan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-view learning; Affinity matrix; Embedding space; Self-paced learning; Clustering; Representation; Tensor;
DOI
10.1007/s13042-022-01600-z
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Multi-view clustering (MVC) integrates the complementary information across different views to markedly improve clustering performance. However, existing methods suffer from the following drawbacks: (1) multi-view data often lie in high-dimensional spaces and are inevitably corrupted by noise and even outliers, which makes it difficult to fully exploit the intrinsic structure of the views; (2) the non-convex objective functions are prone to getting stuck in bad local minima; and (3) high-order structure information is largely ignored, resulting in suboptimal solutions. To alleviate these problems, this paper proposes a novel method, namely Self-paced Latent Embedding Space Learning (SLESL). Specifically, the views are projected into a latent embedding space to reduce their dimensionality and clean the data, proceeding from simple to complex samples in a self-paced manner. Meanwhile, multiple candidate graphs are learned in the latent space via embedded self-expressiveness learning. These graphs are then stacked into a tensor to exploit the high-order structure information across views, so that a refined consensus affinity graph can be obtained for spectral clustering. Experimental results demonstrate the effectiveness of the proposed method.
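As a rough illustration only (the exact objective appears in the paper itself and is not reproduced in this record), the pipeline described in the abstract — self-paced latent embedding, per-view self-expressive graphs, and a tensor built from the stacked graphs — is typically formulated as an objective of the following kind. All symbols here (the views X_v, projections P_v, latent embeddings H_v, graphs Z_v, the tensor norm, and the self-paced regularizer f) are illustrative assumptions, not notation taken from the paper:

\[
\min_{\{P_v, H_v, Z_v\},\, w}\; \sum_{v=1}^{V} w_v \Big( \|X_v - P_v H_v\|_F^2 \;+\; \lambda_1 \|H_v - H_v Z_v\|_F^2 \Big) \;+\; \lambda_2 \|\mathcal{Z}\|_{\circledast} \;+\; f(w;\gamma),
\]

where \(X_v \in \mathbb{R}^{d_v \times n}\) is the \(v\)-th view, \(P_v\) and \(H_v\) are its projection and latent embedding, \(Z_v \in \mathbb{R}^{n \times n}\) is a self-expressive affinity graph learned in the latent space, \(\mathcal{Z}\) is the third-order tensor obtained by stacking the \(Z_v\), \(\|\cdot\|_{\circledast}\) denotes a tensor low-rank regularizer capturing high-order structure, and \(f(w;\gamma)\) is a self-paced regularizer on the weights \(w\) that admits progressively harder samples or views as the age parameter \(\gamma\) grows; the consensus affinity graph recovered from \(\mathcal{Z}\) is then fed to spectral clustering.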
Pages: 3373-3386
Page count: 14