Leveraging Transformer-based autoencoders for low-rank multi-view subspace clustering

Cited by: 0
|
Authors
Lin, Yuxiu [1 ,2 ]
Liu, Hui [1 ,2 ]
Yu, Xiao [1 ,2 ]
Zhang, Caiming [2 ,3 ]
Affiliations
[1] Shandong Univ Finance & Econ, Sch Comp Sci & Technol, Jinan 250014, Peoples R China
[2] Shandong Key Lab Lightweight Intelligent Comp & Vi, Jinan 250014, Peoples R China
[3] Shandong Univ, Sch Software, Jinan 250101, Peoples R China
Keywords
Multi-view representation learning; Subspace clustering; Transformer; Weighted Schatten p-norm
DOI: 10.1016/j.patcog.2024.111331
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Deep multi-view subspace clustering is an active research topic that aims to integrate multi-view information to produce accurate cluster predictions. Limited by the inherent heterogeneity of distinct views, existing works rely primarily on view-specific encoding structures for representation learning. Although effective to some extent, this approach may hinder full exploitation of view information and increase the complexity of model training. To this end, this paper proposes TALMSC, a novel low-rank multi-view subspace clustering method backed by Transformer-based autoencoders. Specifically, we extend the self-attention mechanism to multi-view clustering settings, developing multiple Transformer-based autoencoders that allow modality-agnostic representation learning. Based on the extracted latent representations, we deploy a sample-wise weighted fusion module that incorporates contrastive learning and orthogonal operators to formulate both consistency and diversity, thereby generating a comprehensive joint representation. Moreover, TALMSC employs a highly flexible low-rank regularizer under the weighted Schatten p-norm to constrain self-expression and better explore the low-rank structure. Extensive experiments on five multi-view datasets show that our method achieves superior clustering performance over state-of-the-art methods.
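The weighted Schatten p-norm mentioned in the abstract is the standard definition ||X||_{w,Sp}^p = Σ_i w_i σ_i(X)^p over the singular values σ_i of X. The sketch below is only an illustration of that formula with NumPy; the function name, the default p = 0.5, and the unit weights are assumptions for the example, not details taken from the paper, which applies such a regularizer to the self-expression matrix.

```python
import numpy as np

def weighted_schatten_p_norm(X, weights=None, p=0.5):
    """Illustrative sketch (not the paper's code): computes
    ||X||_{w,Sp}^p = sum_i w_i * sigma_i(X)**p, where sigma_i
    are the singular values of X. p < 1 penalizes small singular
    values more gently than the nuclear norm (p = 1), which is
    why such regularizers better approximate low rank."""
    sigma = np.linalg.svd(X, compute_uv=False)  # singular values, descending
    if weights is None:
        weights = np.ones_like(sigma)  # unweighted case (assumption)
    return float(np.sum(weights * sigma ** p))

# The 3x3 identity has singular values (1, 1, 1), so with unit
# weights the value equals the rank regardless of p.
print(weighted_schatten_p_norm(np.eye(3), p=0.5))  # 3.0
```

With p = 1 and unit weights this reduces to the nuclear norm, the usual convex surrogate for rank; choosing p < 1 and non-uniform weights gives the "highly flexible" behavior the abstract refers to.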
Pages: 10