Logarithmic Schatten-p Norm Minimization for Tensorial Multi-View Subspace Clustering

Cited by: 40
Authors
Guo, Jipeng [1 ]
Sun, Yanfeng [1 ]
Gao, Junbin [2 ]
Hu, Yongli [1 ]
Yin, Baocai [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing Key Lab Multimedia & Intelligent Software, Beijing Inst Artificial Intelligence, Beijing 100124, Peoples R China
[2] Univ Sydney, Business Sch, Discipline Business Analyt, Camperdown, NSW 2006, Australia
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation; National Key Research and Development Program of China;
Keywords
Multi-view subspace clustering; Low-rank tensor representation; Tensor logarithmic Schatten-p norm; Non-convex optimization; Convergence guarantees; LOW-RANK; REPRESENTATION;
DOI
10.1109/TPAMI.2022.3179556
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Low-rank tensors can characterize inner structure and capture high-order correlations among multi-view representations, and have therefore been widely used in multi-view clustering. Existing approaches adopt the tensor nuclear norm (TNN) as a convex approximation of the non-convex tensor rank function. However, TNN treats all singular values equally and over-penalizes the main rank components, leading to sub-optimal tensor representations. In this paper, we devise a better surrogate of tensor rank, namely the tensor logarithmic Schatten-p norm (TLSpN), which fully accounts for the physical differences between singular values through a non-convex, non-linear penalty function. Further, a tensor logarithmic Schatten-p norm minimization (TLSpNM)-based multi-view subspace clustering (TLSpNM-MSC) model is proposed. Specifically, the proposed TLSpNM not only protects the larger singular values, which encode useful structural information, but also removes the smaller ones, which encode redundant information. Thus, the learned tensor representation with a compact low-rank structure explores the complementary information well and accurately characterizes the high-order correlations among the views. The alternating direction method of multipliers (ADMM) is used to solve the non-convex multi-block TLSpNM-MSC model, in which the challenging TLSpNM subproblem is carefully handled. Importantly, the convergence of the algorithm is mathematically established by showing that the sequence it generates is a Cauchy sequence and converges to a Karush-Kuhn-Tucker (KKT) point. Experimental results on nine benchmark databases demonstrate the superiority of the TLSpNM-MSC model.
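The abstract's core claim is that a logarithmic Schatten-p penalty shrinks small singular values strongly while barely penalizing large ones, unlike the nuclear norm, which grows linearly in every singular value. The sketch below illustrates this contrast on a plain matrix using one common form of such a penalty, sum_i log(1 + sigma_i^p); the function name, the exact penalty form, and the constants are illustrative assumptions, not the paper's definition of TLSpN (which is defined for tensors).

```python
import numpy as np

def log_schatten_p_penalty(X, p=0.5):
    """Illustrative non-convex low-rank surrogate: sum_i log(1 + sigma_i^p).
    NOTE: an assumed common form, not the paper's exact TLSpN definition."""
    sigma = np.linalg.svd(X, compute_uv=False)
    return float(np.sum(np.log(1.0 + sigma ** p)))

def nuclear_norm(X):
    """Convex surrogate: sum of singular values (what TNN uses per slice)."""
    return float(np.sum(np.linalg.svd(X, compute_uv=False)))

# A rank-1 matrix with one large singular value (40) and the rest zero.
u = np.ones((4, 1))
X = 10.0 * (u @ u.T)

# The nuclear norm charges the dominant component its full magnitude (40),
# while the log penalty grows only logarithmically (~log(1 + sqrt(40)) ~ 2),
# leaving the "main rank component" comparatively protected.
print(nuclear_norm(X))            # 40.0
print(log_schatten_p_penalty(X))  # ~2.0
```

Scaling the matrix up makes the gap wider: the nuclear norm grows linearly with the dominant singular value, while the log penalty saturates, which is exactly the "protect the large, remove the small" behavior the abstract describes.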
Pages: 3396-3410
Page count: 15