Multiview nonlinear discriminant structure learning for emotion recognition

Cited: 1
Authors
Guo, Shuai [1 ]
Song, Li [2 ,3 ]
Xie, Rong [1 ]
Li, Lin [4 ]
Liu, Shenglan [5 ,6 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Inst Image Commun & Network Engn, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Inst Image Commun & Network Engn, Shanghai 200240, Peoples R China
[3] Shanghai Jiao Tong Univ, AI Inst, MoE Key Lab Artificial Intelligence, Shanghai 200240, Peoples R China
[4] MIGU Co Ltd, Beijing 100120, Peoples R China
[5] Dalian Univ Technol, Sch Innovat & Entrepreneurship, Dalian 116024, Peoples R China
[6] Dalian Univ Technol, Fac Elect Informat & Elect Engn, Dalian 116024, Peoples R China
Keywords
Multiview subspace learning; Emotion recognition; Nonlinear; Uncorrelated; Out-of-sample; Canonical correlation analysis; Laplacian eigenmaps; Models; Scale; Joint; Sets
DOI
10.1016/j.knosys.2022.110042
CLC classification code
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Multiview subspace learning (MSL) has been widely used in practical applications, including emotion recognition. Despite recent progress in MSL, two challenges remain to be addressed. First, most existing MSL methods indiscriminately exploit both the helpful and the defective information contained in different views. Second, the most recent methods are linear approaches that perform poorly on emotion datasets with weak linear separability. Therefore, in this study, we introduce a framework for emotion recognition: multiview nonlinear discriminant structure learning (MNDSL). MNDSL fully exploits the useful information in each input view through local information preservation and discriminant reconstruction (LPDR) and obtains latent subspaces using multiview discriminant latent subspace learning (MDLSL). In addition, an out-of-sample extension is introduced to satisfy the requirements of large-scale applications and to obtain the projections of new samples. The proposed framework constructs inter-view and intra-view weighted connections to explore discriminant structures and preserve locality under the complementarity and correlation principles. Experimental results demonstrate the superiority of the proposed framework over state-of-the-art methods. (c) 2022 Elsevier B.V. All rights reserved.
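The paper itself is not reproduced here, so the following is only a minimal illustration of the general ideas the abstract names, not the authors' MNDSL algorithm: multiview subspace learning via classic two-view canonical correlation analysis (one of the baselines listed in the keywords), with the learned linear maps reused as an out-of-sample extension for unseen samples. All function and variable names below are my own, and the synthetic data is invented for the sketch.

```python
import numpy as np

def cca(X, Y, n_components=1, reg=1e-6):
    """Classic two-view CCA. Returns projection matrices Wx, Wy and the
    per-view training means needed to project new (out-of-sample) data."""
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mx, Y - my
    n = X.shape[0]
    # regularized view covariances and cross-covariance
    Sxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / n

    def inv_sqrt(S):
        # inverse matrix square root via eigendecomposition (S is SPD)
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Kx, Ky = inv_sqrt(Sxx), inv_sqrt(Syy)
    # SVD of the whitened cross-covariance gives the canonical directions
    U, s, Vt = np.linalg.svd(Kx @ Sxy @ Ky)
    Wx = Kx @ U[:, :n_components]
    Wy = Ky @ Vt.T[:, :n_components]
    return Wx, Wy, mx, my

# two synthetic views driven by a shared 1-D latent signal
rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))
X = z @ rng.standard_normal((1, 5)) + 0.1 * rng.standard_normal((500, 5))
Y = z @ rng.standard_normal((1, 4)) + 0.1 * rng.standard_normal((500, 4))
Wx, Wy, mx, my = cca(X, Y)

# out-of-sample extension: project unseen samples with the learned maps
x_new = rng.standard_normal((10, 5))
proj = (x_new - mx) @ Wx

# correlation of the first canonical variates on the training data
corr = np.corrcoef(((X - mx) @ Wx).ravel(), ((Y - my) @ Wy).ravel())[0, 1]
```

Because both views share the same latent signal, the first pair of canonical variates is strongly correlated, and any new sample can be embedded without re-solving the problem, which is the point of the out-of-sample extension the abstract mentions.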
Pages: 14