Exploiting Low-Rank Tensor-Train Deep Neural Networks Based on Riemannian Gradient Descent With Illustrations of Speech Processing

Cited by: 8
Authors
Qi, Jun [1 ,2 ]
Yang, Chao-Han Huck [2 ]
Chen, Pin-Yu [3 ]
Tejedor, Javier [4 ]
Affiliations
[1] Fudan Univ, Sch Informat Sci & Engn, Dept Elect Engn, Shanghai 200438, Peoples R China
[2] Georgia Inst Technol, Sch Elect & Comp Engn, Atlanta, GA 30332 USA
[3] IBM Res, Yorktown Heights, NY 10598 USA
[4] CEU Univ, Univ San Pablo CEU, Inst Technol, Boadilla Del Monte 28668, Spain
Keywords
Tensor-train network; speech enhancement; spoken command recognition; Riemannian gradient descent; low-rank tensor-train decomposition; tensor-train deep neural network; MEAN ABSOLUTE ERROR; ALGORITHMS; RMSE; MAE;
DOI
10.1109/TASLP.2022.3231714
CLC Classification
O42 [Acoustics];
Discipline Codes
070206; 082403;
Abstract
This work focuses on designing low-complexity hybrid tensor networks by considering trade-offs between model complexity and practical performance. First, we exploit a low-rank tensor-train deep neural network (TT-DNN) to build an end-to-end deep learning pipeline, namely LR-TT-DNN. Second, a hybrid model combining LR-TT-DNN with a convolutional neural network (CNN), denoted as CNN+(LR-TT-DNN), is set up to boost performance. Instead of randomly assigning large TT-ranks to the TT-DNN, we leverage Riemannian gradient descent to determine a TT-DNN with small TT-ranks. Furthermore, CNN+(LR-TT-DNN) consists of convolutional layers at the bottom for feature extraction and several TT layers at the top to solve regression and classification problems. We separately assess the LR-TT-DNN and CNN+(LR-TT-DNN) models on speech enhancement and spoken command recognition tasks. Our empirical evidence demonstrates that the LR-TT-DNN and CNN+(LR-TT-DNN) models with fewer model parameters can outperform their TT-DNN and CNN+(TT-DNN) counterparts.
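The low-rank tensor-train representation underlying the abstract can be illustrated with the standard TT-SVD algorithm: a d-way tensor is factorized into a chain of small 3-way cores via sequential truncated SVDs, so that storage drops from the product of all mode sizes to a sum of small core sizes. The sketch below is a minimal NumPy illustration of this general technique, not the paper's implementation; the function names `tt_svd` and `tt_to_full` are our own.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into tensor-train (TT) cores via
    sequential truncated SVDs. Core k has shape (r_{k-1}, n_k, r_k)
    with every TT-rank r_k capped at max_rank."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    # Start with the mode-1 matricization: rows indexed by n_1.
    mat = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, S.size)          # truncate to the TT-rank cap
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        # Fold the next mode into the rows and continue.
        mat = (S[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract a list of TT cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        # Contract the trailing rank axis with the next core's leading axis.
        res = np.tensordot(res, core, axes=1)
    return res.reshape([c.shape[1] for c in cores])
```

A dense weight tensor of shape (n_1, ..., n_d) costs prod(n_k) parameters, while its TT form costs only sum(r_{k-1} * n_k * r_k); this is the complexity-versus-performance trade-off the paper tunes by seeking small TT-ranks.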
Pages: 633 - 642 (10 pages)
Related Papers
50 records in total
  • [1] Automatic Differentiation for Riemannian Optimization on Low-Rank Matrix and Tensor-Train Manifolds
    Novikov, Alexander
    Rakhuba, Maxim
    Oseledets, Ivan
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44 (02): : A843 - A869
  • [2] Nonnegative Tensor-Train Low-Rank Approximations of the Smoluchowski Coagulation Equation
    Manzini, Gianmarco
    Skau, Erik
    Truong, Duc P.
    Vangara, Raviteja
    LARGE-SCALE SCIENTIFIC COMPUTING (LSSC 2021), 2022, 13127 : 342 - 350
  • [3] Auto-weighted robust low-rank tensor completion via tensor-train
    Chen, Chuan
    Wu, Zhe-Bin
    Chen, Zi-Tai
    Zheng, Zi-Bin
    Zhang, Xiong-Jun
    INFORMATION SCIENCES, 2021, 567 : 100 - 115
  • [4] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 3380 - 3384
  • [5] Graph Regularized Low-Rank Tensor-Train for Robust Principal Component Analysis
    Sofuoglu, Seyyid Emre
    Aviyente, Selin
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1152 - 1156
  • [6] Riemannian conjugate gradient method for low-rank tensor completion
    Duan, Shan-Qi
    Duan, Xue-Feng
    Li, Chun-Mei
    Li, Jiao-Fen
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2023, 49 (03)
  • [8] A PRECONDITIONED RIEMANNIAN GRADIENT DESCENT ALGORITHM FOR LOW-RANK MATRIX RECOVERY
    Bian, Fengmiao
    Cai, Jian-feng
    Zhang, Rui
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2024, 45 (04) : 2075 - 2103
  • [9] Subspace Methods with Local Refinements for Eigenvalue Computation Using Low-Rank Tensor-Train Format
    Zhang, Junyu
    Wen, Zaiwen
    Zhang, Yin
    JOURNAL OF SCIENTIFIC COMPUTING, 2017, 70 (02) : 478 - 499
  • [10] Challenging the Curse of Dimensionality in Multidimensional Numerical Integration by Using a Low-Rank Tensor-Train Format
    Alexandrov, Boian
    Manzini, Gianmarco
    Skau, Erik W.
    Truong, Phan Minh Duc
    Vuchov, Radoslav G.
    MATHEMATICS, 2023, 11 (03)