Multi-view feature extraction based on dual contrastive heads

Cited: 0
Authors
Zhang, Hongjie [1 ]
Jing, Ling [2 ,3 ]
Affiliations
[1] Tiangong Univ, Sch Math Sci, Tianjin 300387, Peoples R China
[2] China Agr Univ, Coll Informat & Elect Engn, Beijing 100083, Peoples R China
[3] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-view learning; Feature extraction; Dimensionality reduction; Contrastive learning; CANONICAL CORRELATION-ANALYSIS;
DOI
10.1016/j.neucom.2025.130092
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-view feature extraction can effectively address the problem of high dimensionality in multi-view data. Contrastive learning (CL), a popular self-supervised learning method, has recently attracted considerable attention. However, most CL-based methods operate only at the sample level, ignoring structural information that is useful for feature extraction. In this study, we construct a structural-level contrastive loss that promotes consistency of the latent subspace structures across any two views, thereby exploiting realistic and reliable structural information. On this basis, we propose a novel multi-view feature extraction method based on dual contrastive heads, which integrates the structural-level contrastive loss into a sample-level CL-based method. In our method, the structural-level contrastive loss helps the sample-level contrastive loss extract discriminative features more effectively. Furthermore, we reveal the relationship between the structural-level loss and mutual information, as well as its relationship to probabilistic intra-class and inter-class scatter, which provides theoretical support for the strong performance of our method. Experiments on six real-world datasets demonstrate the superior performance of the proposed method compared with existing methods.
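The abstract describes combining a sample-level contrastive loss with a structural-level one that aligns latent subspace structures across views. The NumPy sketch below is only an illustrative approximation, not the paper's implementation: the sample-level term is a standard InfoNCE-style loss, and the structural-level term is modeled here by aligning normalized Gram matrices of the two views' embeddings as a hypothetical proxy for "subspace structure"; the weighting `lam` and temperature `tau` are assumed hyperparameters.

```python
import numpy as np

def sample_level_contrastive(z1, z2, tau=0.5):
    """InfoNCE-style loss: the i-th samples of the two views are positives,
    all other cross-view pairs are negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                        # (n, n) cross-view similarities
    sim -= sim.max(axis=1, keepdims=True)        # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # pull matching pairs together

def structural_level_contrastive(z1, z2):
    """Encourage the two views to share the same latent structure.
    Here 'structure' is approximated by a Frobenius-normalized Gram matrix."""
    def structure(z):
        s = z @ z.T
        return s / np.linalg.norm(s)
    return float(np.linalg.norm(structure(z1) - structure(z2)) ** 2)

def dual_contrastive_loss(z1, z2, lam=1.0, tau=0.5):
    """Sample-level loss plus lam-weighted structural-level loss."""
    return sample_level_contrastive(z1, z2, tau) + lam * structural_level_contrastive(z1, z2)
```

When the two views' embeddings are identical, the structural term vanishes, so the gradient pressure comes entirely from the sample-level term; as the views drift apart structurally, the second term penalizes the mismatch.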
Pages: 13
Related papers
50 records total
  • [11] Graph Contrastive Partial Multi-View Clustering
    Wang, Yiming
    Chang, Dongxia
    Fu, Zhiqiang
    Wen, Jie
    Zhao, Yao
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25: 6551 - 6562
  • [12] A multi-rank two-dimensional CCA based on PDEs for multi-view feature extraction
    Yang, Jing
    Fan, Liya
    Sun, Quansen
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 242
  • [13] Contrastive Learning Based Multi-view Feature Fusion Model for Aspect-Based Sentiment Analysis
    Wu, Xing
    Xia, Hongbin
    Liu, Yuan
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2024, 37 (10): : 910 - 922
  • [14] Self-weighted dual contrastive multi-view clustering network
    Huajuan Huang
    Yanbin Mei
    Xiuxi Wei
    Yongquan Zhou
    Scientific Reports, 15 (1)
  • [15] Dual-dimensional contrastive learning for incomplete multi-view clustering
    Zhu, Zhengzhong
    Pu, Chujun
    Zhang, Xuejie
    Wang, Jin
    Zhou, Xiaobing
    NEUROCOMPUTING, 2025, 615
  • [16] Forest Fire Recognition Based on Feature Extraction from Multi-View Images
    Wu, Di
    Zhang, Chunjiong
    Ji, Li
    Ran, Rong
    Wu, Huaiyu
    Xu, Yanmin
    TRAITEMENT DU SIGNAL, 2021, 38 (03) : 775 - 783
  • [17] Subgraph feature extraction based on multi-view dictionary learning for graph classification
    Zheng, Xin
    Liang, Shouzhi
    Liu, Bo
    Xiong, Xiaoming
    Hu, Xianghong
    Liu, Yuan
    KNOWLEDGE-BASED SYSTEMS, 2021, 214
  • [18] Contrastive Multi-View Kernel Learning
    Liu, Jiyuan
    Liu, Xinwang
    Yang, Yuexiang
    Liao, Qing
    Xia, Yuanqing
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (08) : 9552 - 9566
  • [19] Structure-guided feature and cluster contrastive learning for multi-view clustering
    Shu, Zhenqiu
    Li, Bin
    Mao, Cunli
    Gao, Shengxiang
    Yu, Zhengtao
    NEUROCOMPUTING, 2024, 582
  • [20] Selective Contrastive Learning for Unpaired Multi-View Clustering
    Xin, Like
    Yang, Wanqi
    Wang, Lei
    Yang, Ming
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 1749 - 1763