Leveraging Transformer-based autoencoders for low-rank multi-view subspace clustering

Cited by: 0
Authors
Lin, Yuxiu [1,2]
Liu, Hui [1,2]
Yu, Xiao [1,2]
Zhang, Caiming [2,3]
Affiliations
[1] Shandong Univ Finance & Econ, Sch Comp Sci & Technol, Jinan 250014, Peoples R China
[2] Shandong Key Lab Lightweight Intelligent Comp & Vi, Jinan 250014, Peoples R China
[3] Shandong Univ, Sch Software, Jinan 250101, Peoples R China
Keywords
Multi-view representation learning; Subspace clustering; Transformer; Weighted Schatten p-norm
DOI
10.1016/j.patcog.2024.111331
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Deep multi-view subspace clustering, which integrates information from multiple views to produce accurate cluster predictions, is an active research topic. Limited by the inherent heterogeneity of distinct views, existing works primarily rely on view-specific encoding structures for representation learning. Although effective to some extent, this design may hinder the full exploitation of view information and increase the complexity of model training. To this end, this paper proposes TALMSC, a novel low-rank multi-view subspace clustering method backed by Transformer-based autoencoders. Specifically, we extend the self-attention mechanism to multi-view clustering settings, developing multiple Transformer-based autoencoders that allow for modality-agnostic representation learning. Based on the extracted latent representations, we deploy a sample-wise weighted fusion module that incorporates contrastive learning and orthogonal operators to formulate both consistency and diversity, consequently generating a comprehensive joint representation. Moreover, TALMSC involves a highly flexible low-rank regularizer under the weighted Schatten p-norm to constrain self-expression and better explore the low-rank structure. Extensive experiments on five multi-view datasets show that our method achieves superior clustering performance over state-of-the-art methods.
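To make the low-rank regularizer concrete: the weighted Schatten p-norm of a matrix Z is the weighted sum of its singular values raised to the power p, i.e. sum_i w_i * sigma_i(Z)^p, which interpolates between the nuclear norm (p = 1, uniform weights) and a rank surrogate as p shrinks. The sketch below is a minimal illustration of that penalty applied to a self-expression matrix, not the authors' implementation; the function name, the linearly spaced weights, and the choice p = 0.5 are assumptions, since the abstract does not fix them.

    # Minimal sketch (assumed, not TALMSC's code) of a weighted Schatten
    # p-norm penalty on a self-expression coefficient matrix Z.
    import numpy as np

    def weighted_schatten_p_norm(Z: np.ndarray, weights: np.ndarray, p: float) -> float:
        """Return sum_i w_i * sigma_i(Z)^p, where sigma_i are the
        singular values of Z in descending order."""
        sigma = np.linalg.svd(Z, compute_uv=False)  # singular values, descending
        return float(np.sum(weights * sigma ** p))

    # Usage on a random stand-in for the self-expression matrix.
    rng = np.random.default_rng(0)
    Z = rng.standard_normal((50, 50))
    # Increasing weights put more penalty on the smaller (trailing) singular
    # values, pushing them toward zero and encouraging a low-rank Z; this
    # weighting scheme is illustrative, not taken from the paper.
    w = np.linspace(0.1, 1.0, 50)
    print(f"penalty: {weighted_schatten_p_norm(Z, w, p=0.5):.3f}")

Because the weights and the exponent p are free parameters, such a regularizer can emphasize or relax the low-rank constraint per singular value, which is presumably what the abstract means by "highly flexible".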
Pages: 10
Related Papers (50 records in total)
  • [21] Enhanced tensor low-rank representation learning for multi-view clustering
    Xie, Deyan
    Gao, Quanxue
    Yang, Ming
    NEURAL NETWORKS, 2023, 161 : 93 - 104
  • [22] Low-rank tensor learning with projection distance metric for multi-view clustering
    Huang, Sujia
    Fu, Lele
    Du, Shide
    Wu, Zhihao
    Vasilakos, Athanasios V.
    Wang, Shiping
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2025, 16 (01) : 25 - 41
  • [23] Hypergraph regularized low-rank tensor multi-view subspace clustering via L1 norm constraint
    Liu, Guoqing
    Ge, Hongwei
    Su, Shuzhi
    Wang, Shuangxi
    APPLIED INTELLIGENCE, 2023, 53 (12) : 16089 - 16106
  • [25] Two Rank Approximations for Low-Rank Based Subspace Clustering
    Xu, Fei
    Peng, Chong
    Hu, Yunhong
    He, Guoping
2017 10TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING, BIOMEDICAL ENGINEERING AND INFORMATICS (CISP-BMEI), 2017
  • [26] Nonconvex low-rank tensor approximation with graph and consistent regularizations for multi-view subspace learning
    Pan, Baicheng
    Li, Chuandong
    Che, Hangjun
    NEURAL NETWORKS, 2023, 161 : 638 - 658
  • [27] Joint local smoothness and low-rank tensor representation for robust multi-view clustering
    Du, Yangfan
    Lu, Gui-Fu
    PATTERN RECOGNITION, 2025, 157
  • [28] Error-robust multi-view subspace clustering with nonconvex low-rank tensor approximation and hyper-Laplacian graph embedding
    Pan, Baicheng
    Li, Chuandong
    Che, Hangjun
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [29] Multi-view subspace clustering based on adaptive search
    Dong, Anxue
    Wu, Zikai
    Zhang, Hongjuan
    KNOWLEDGE-BASED SYSTEMS, 2024, 289
  • [30] Multi-view subspace text clustering
    Fraj, Maha
    Hajkacem, Mohamed Aymen Ben
    Essoussi, Nadia
    JOURNAL OF INTELLIGENT INFORMATION SYSTEMS, 2024, 62 (06) : 1583 - 1606