Multiview dimension reduction via Hessian multiset canonical correlations

Cited by: 83
Authors
Liu, Weifeng [1 ]
Yang, Xinghao [1 ]
Tao, Dapeng [2 ]
Cheng, Jun [3 ,4 ]
Tang, Yuanyan [5 ,6 ]
Affiliations
[1] China Univ Petr East China, Coll Informat & Control Engn, Qingdao 266580, Shandong, Peoples R China
[2] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650091, Yunnan, Peoples R China
[3] Chinese Acad Sci, Shenzhen Key Lab CVPR, Shenzhen Inst Adv Technol, Shenzhen, Guangdong, Peoples R China
[4] Chinese Univ Hong Kong, Hong Kong, Hong Kong, Peoples R China
[5] Univ Macau, Fac Sci & Technol, Macau 999078, Peoples R China
[6] Chongqing Univ, Coll Comp Sci, Chongqing 400000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multiview; Dimension reduction; Hessian; Canonical correlation analysis; FUSION;
DOI
10.1016/j.inffus.2017.09.001
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Canonical correlation analysis (CCA) is a principal linear subspace technique for two-view dimension reduction that finds basis vectors maximizing the correlation between the pair of variables. The shortcoming of traditional CCA is that it only handles data represented by two-view features and cannot reveal nonlinear correlation relationships. In recent years, many variant algorithms have been developed to extend the capability of CCA, such as discriminative CCA, sparse CCA, kernel CCA, locality preserving CCA and multiset canonical correlation analysis (MCCA). One representative work is Laplacian multiset canonical correlations (LapMCC), which employs the graph Laplacian to exploit the nonlinear correlation information in multiview high-dimensional data. However, it can exhibit poor extrapolating power because Laplacian regularization biases the solution towards a constant function. In this paper, we present Hessian multiset canonical correlations (HesMCC) for multiview dimension reduction. In contrast to the Laplacian, the Hessian properly exploits the intrinsic local geometry of the data manifold. HesMCC takes advantage of the Hessian, providing superior extrapolating capability and ultimately improving performance. Extensive experiments on several popular datasets for handwritten digit classification, face classification and object classification validate the effectiveness of the proposed HesMCC algorithm in comparison with baseline algorithms including TCCA, KMUDA, MCCA and LapMCC. (C) 2017 Elsevier B.V. All rights reserved.
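For orientation, the following is a minimal sketch of the classical two-view CCA baseline summarized in the abstract, solved as a generalized eigenvalue problem. It illustrates only standard CCA, not the authors' HesMCC algorithm; the function name cca, the variable names, and the small ridge term reg are illustrative assumptions rather than anything specified in the paper.

import numpy as np
from scipy.linalg import eigh

def cca(X, Y, n_components=2, reg=1e-6):
    """Return projection matrices (Wx, Wy) that maximize the correlation
    between the projected views X @ Wx and Y @ Wy (classical two-view CCA)."""
    # Center each view.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n, dx = X.shape
    dy = Y.shape[1]

    # Covariance blocks, with a small ridge for numerical stability.
    Cxx = X.T @ X / (n - 1) + reg * np.eye(dx)
    Cyy = Y.T @ Y / (n - 1) + reg * np.eye(dy)
    Cxy = X.T @ Y / (n - 1)

    # Generalized eigenproblem  A w = rho * B w  with
    # A = [[0, Cxy], [Cyx, 0]] and B = blockdiag(Cxx, Cyy);
    # the largest eigenvalues are the canonical correlations.
    A = np.block([[np.zeros((dx, dx)), Cxy],
                  [Cxy.T, np.zeros((dy, dy))]])
    B = np.block([[Cxx, np.zeros((dx, dy))],
                  [np.zeros((dy, dx)), Cyy]])
    vals, vecs = eigh(A, B)

    # Keep the directions with the largest canonical correlations.
    order = np.argsort(vals)[::-1][:n_components]
    W = vecs[:, order]
    return W[:dx], W[dx:]  # projections for the X view and the Y view

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(200, 2))  # shared latent signal behind both views
    X = Z @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(200, 10))
    Y = Z @ rng.normal(size=(2, 8)) + 0.1 * rng.normal(size=(200, 8))
    Wx, Wy = cca(X, Y)
    # The first pair of canonical variates should be highly correlated.
    print(np.corrcoef(X @ Wx[:, 0], Y @ Wy[:, 0])[0, 1])

The paper builds on this baseline: MCCA generalizes the objective to more than two views, LapMCC adds a graph-Laplacian regularizer to capture nonlinear correlation structure, and the proposed HesMCC replaces the Laplacian with a Hessian-based regularizer to better preserve the local geometry of the data manifold and improve extrapolation.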
Pages: 119-128
Number of pages: 10