Multi-view dimensionality reduction learning with hierarchical sparse feature selection

Cited by: 1
Authors
Guo, Wei [1 ,2 ]
Wang, Zhe [1 ,2 ]
Yang, Hai [2 ]
Du, Wenli [1 ]
Affiliations
[1] East China Univ Sci & Technol, Minist Educ, Key Lab Smart Mfg Energy Chem Proc, Shanghai 200237, Peoples R China
[2] East China Univ Sci & Technol, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
Funding
National Science Foundation (US);
Keywords
Multi-view learning; Dimensionality reduction; View selection; Feature selection;
DOI
10.1007/s10489-022-04161-4
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-view data depict samples from multiple views, and learners can benefit from such complementary information, so multi-view learning has attracted extensive study in recent years. However, multi-view data often lie in a high-dimensional space and bring noisy or redundant views and features into the learning process, which can degrade a learner's performance. To address this issue, we propose a novel unsupervised Multi-view Dimensionality Reduction learning framework with Hierarchical Sparse Feature Selection (MvDRHSFS) that learns a low-dimensional subspace by jointly selecting the most informative views and features in a hierarchical manner. More specifically, we penalize the projection matrix with the Frobenius norm (F-norm) and the l(2,1)-norm to select the most informative views and features hierarchically. Under the penalty of these two regularization terms, projection-based Single-view Dimensionality Reduction (SvDR) methods can learn a more meaningful low-dimensional subspace of multi-view data. In our implementation, we adopt the regression type of PCA and relax the orthogonality constraint on the projection matrix to learn the low-dimensional subspace more flexibly. To find the optimal solution of the proposed framework, we derive an effective procedure for optimizing the given formulation and provide a theoretical convergence analysis of the optimization algorithm. Extensive experimental results on several real-world datasets demonstrate the feasibility and superiority of the proposed framework.
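The abstract's central idea can be illustrated with a minimal sketch: the l(2,1)-norm of a projection matrix W sums the l2 norms of its rows, so penalizing it drives whole rows (features) toward zero, while a regression-type PCA term measures reconstruction error without an orthogonality constraint. The variable names, toy data, and the combined objective below are illustrative assumptions, not the authors' actual MvDRHSFS formulation.

```python
import numpy as np

def l21_norm(W):
    """l(2,1)-norm: sum of the l2 norms of the rows of W.
    Penalizing this induces row sparsity (feature selection)."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

def pca_regression_loss(X, W):
    """Regression-type PCA reconstruction error ||X - X W W^T||_F^2,
    with the orthogonality constraint on W relaxed (as the abstract
    suggests), so W is an unconstrained projection matrix."""
    return float(np.linalg.norm(X - X @ W @ W.T, "fro") ** 2)

# Toy single-view data: 100 samples, 20 features, projected to 5 dims.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
W = rng.standard_normal((20, 5))

# Illustrative combined objective: reconstruction + sparsity penalty.
obj = pca_regression_loss(X, W) + 0.1 * l21_norm(W)
print(f"toy objective value: {obj:.2f}")
```

In the multi-view setting described by the paper, a second, view-wise F-norm penalty would additionally be applied per view so that entire uninformative views can be suppressed, yielding the hierarchical (view-then-feature) selection.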
Pages: 12774-12791
Number of pages: 18
Related papers
50 records
  • [1] Multi-view dimensionality reduction learning with hierarchical sparse feature selection
    Wei Guo
    Zhe Wang
    Hai Yang
    Wenli Du
    Applied Intelligence, 2023, 53 : 12774 - 12791
  • [2] Multi-View Multi-Label Learning With Sparse Feature Selection for Image Annotation
    Zhang, Yongshan
    Wu, Jia
    Cai, Zhihua
    Yu, Philip S.
    IEEE TRANSACTIONS ON MULTIMEDIA, 2020, 22 (11) : 2844 - 2857
  • [3] Multi-view dimensionality reduction based on Universum learning
    Chen, Xiaohong
    Yin, Hujun
    Jiang, Fan
    Wang, Liping
    NEUROCOMPUTING, 2018, 275 : 2279 - 2286
  • [4] Weighted feature selection via discriminative sparse multi-view learning
    Zhong, Jing
    Wang, Nan
    Lin, Qiang
    Zhong, Ping
    KNOWLEDGE-BASED SYSTEMS, 2019, 178 : 132 - 148
  • [5] Hierarchical unsupervised multi-view feature selection
    Chen, Tingjian
    Yuan, Haoliang
    Yin, Ming
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2022, 20 (06)
  • [6] Multi-view feature selection via sparse tensor regression
    Yuan, Haoliang
    Lo, Sio-Long
    Yin, Ming
    Liang, Yong
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2021, 19 (05)
  • [7] Robust Multi-View Feature Selection
    Liu, Hongfu
    Mao, Haiyi
    Fu, Yun
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 281 - 290
  • [8] Multi-View Projection Learning via Adaptive Graph Embedding for Dimensionality Reduction
    Li, Haohao
    Gao, Mingliang
    Wang, Huibing
    Jeon, Gwanggil
    ELECTRONICS, 2023, 12 (13)
  • [9] Adaptive graph weighting for multi-view dimensionality reduction
    Xu, Xinyi
    Yang, Yanhua
    Deng, Cheng
    Nie, Feiping
    SIGNAL PROCESSING, 2019, 165 : 186 - 196
  • [10] Multi-view Laplacian Sparse Feature Selection for Web Image Annotation
    Shi Caijuan
    Ruan Qiuqi
    An Gaoyun
    2014 12TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP), 2014, : 1026 - 1029