Multi-view subspace clustering with Kronecker-basis-representation-based tensor sparsity measure

Cited by: 3
Authors
Lu, Gui-Fu [1]
Li, Hua [1]
Wang, Yong [1]
Tang, Ganyi [1]
Affiliations
[1] AnHui Polytech Univ, Sch Comp Sci & Informat, Wuhu 241000, Anhui, Peoples R China
Funding
Anhui Provincial Natural Science Foundation
Keywords
Multi-view features; Subspace clustering; Tucker decomposition; CANDECOMP/PARAFAC (CP) decomposition
DOI
10.1007/s00138-021-01247-w
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Multi-view data are common in many machine learning and computer vision applications; in computer vision, for example, one object can be described by images, text or videos. Recently, multi-view subspace clustering approaches, which exploit the complementary information among different views to improve clustering performance, have attracted much attention. In this paper, we propose a novel multi-view subspace clustering method with a Kronecker-basis-representation-based tensor sparsity measure (MSC-KBR) to address the multi-view subspace clustering problem. In the MSC-KBR model, we first construct a tensor from the subspace representation matrices of the different views, so that the high-order correlations underlying the views can be explored. We then apply the Kronecker-basis-representation-based tensor sparsity measure (KBR) to the constructed tensor to reduce the redundancy of the learned subspace representations and improve clustering accuracy. Unlike traditional unfolding-based tensor norms, KBR encodes the sparsity insights delivered by both the Tucker and the CANDECOMP/PARAFAC (CP) decompositions of a general tensor. Using the augmented Lagrangian method, an efficient algorithm is presented to solve the optimization problem of the MSC-KBR model. Experimental results on several datasets show that the proposed MSC-KBR model outperforms many state-of-the-art multi-view clustering approaches.
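The abstract describes two computational ingredients: stacking the per-view subspace representation matrices into a third-order tensor, and penalizing that tensor with a KBR-style sparsity measure that combines the sparsity of a Tucker core with the low-rankness of the tensor unfoldings. The NumPy sketch below is only an illustrative approximation of these two steps, not the authors' implementation; the higher-order-SVD core, the log-sum surrogate for core sparsity, the nuclear-norm surrogate for the unfolding ranks and the weight `beta` are all assumptions made for clarity.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a third-order tensor into a matrix."""
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1))

def hosvd_core(T):
    """Tucker core via higher-order SVD: C = T x_1 U1^T x_2 U2^T x_3 U3^T."""
    Us = [np.linalg.svd(unfold(T, m), full_matrices=False)[0] for m in range(3)]
    C = T.copy()
    for m, U in enumerate(Us):
        # Contract mode m of C with U^T, then move the new axis back to position m.
        C = np.moveaxis(np.tensordot(U.T, np.moveaxis(C, m, 0), axes=1), 0, m)
    return C

def kbr_sparsity(T, beta=1.0, eps=1e-6):
    """Relaxed KBR-style measure (illustrative): log-sum sparsity of the Tucker
    core plus beta times the product of nuclear norms of the three mode
    unfoldings (a convex surrogate for the unfolding ranks)."""
    core_term = np.sum(np.log1p(np.abs(hosvd_core(T)) / eps))
    nuclear_norms = [np.sum(np.linalg.svd(unfold(T, m), compute_uv=False))
                     for m in range(3)]
    return core_term + beta * np.prod(nuclear_norms)

# Stack the per-view subspace representation matrices Z^(v) (each n x n)
# into an n x n x V tensor, as in the construction step of the abstract.
rng = np.random.default_rng(0)
n, V = 50, 3
Zs = [rng.random((n, n)) for _ in range(V)]   # placeholders for learned Z^(v)
Z_tensor = np.stack(Zs, axis=2)
print("KBR-style sparsity value:", kbr_sparsity(Z_tensor))
```

In the full model this measure would serve as the regularizer whose minimization (e.g., via the augmented Lagrangian method mentioned in the abstract) drives the learned representations toward a low-redundancy, low-rank tensor; the sketch only evaluates the measure on a given tensor.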
Pages: 12
Related Papers (50 records in total)
  • [1] Multi-view subspace clustering with Kronecker-basis-representation-based tensor sparsity measure
    Lu, Gui-Fu; Li, Hua; Wang, Yong; Tang, Ganyi
    Machine Vision and Applications, 2021, 32
  • [2] Multi-View Robust Tensor-Based Subspace Clustering
    Al-Sharoa, Esraa M.; Al-Wardat, Mohammad A.
    IEEE Access, 2022, 10: 134292-134306
  • [3] Weighted Low-Rank Tensor Representation for Multi-View Subspace Clustering
    Wang, Shuqin; Chen, Yongyong; Zheng, Fangying
    Frontiers in Physics, 2021, 8
  • [4] Multi-view Subspace Clustering with Joint Tensor Representation and Indicator Matrix Learning
    Wang, Jing; Zhang, Xiaoqian; Liu, Zhigui; Yue, Zhuang; Huang, Zhengliang
    Artificial Intelligence, CICAI 2022, Pt II, 2022, 13605: 450-461
  • [5] Low-Rank and Sparse Tensor Representation for Multi-View Subspace Clustering
    Wang, Shuqin; Chen, Yongyong; Cen, Yigang; Zhang, Linna; Voronin, Viacheslav
    2021 IEEE International Conference on Image Processing (ICIP), 2021: 1534-1538
  • [6] Multi-view Subspace Clustering Based on Unified Measure Standard
    Tang, Kewei; Wang, Xiaoru; Li, Jinhong
    Neural Processing Letters, 2023, 55 (05): 6231-6246
  • [7] Constrained Tensor Representation Learning for Multi-View Semi-Supervised Subspace Clustering
    Tang, Yongqiang; Xie, Yuan; Zhang, Chenyang; Zhang, Wensheng
    IEEE Transactions on Multimedia, 2022, 24: 3920-3933
  • [8] Nonconvex low-rank and sparse tensor representation for multi-view subspace clustering
    Wang, Shuqin; Chen, Yongyong; Cen, Yigang; Zhang, Linna; Wang, Hengyou; Voronin, Viacheslav
    Applied Intelligence, 2022, 52 (13): 14651-14664