Optimization on the Hierarchical Tucker manifold - Applications to tensor completion

Cited by: 56
Authors
Da Silva, Curt [1 ]
Herrmann, Felix J. [2 ]
Affiliations
[1] Univ British Columbia, Dept Math, Vancouver, BC V5Z 1M9, Canada
[2] Univ British Columbia, Dept Earth & Ocean Sci, Vancouver, BC V5Z 1M9, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
Hierarchical Tucker tensors; Tensor completion; Riemannian manifold optimization; Gauss-Newton; Differential geometry; Low-rank tensor; Singular value decomposition; Approximation; Convergence; Algorithm; Rank
DOI
10.1016/j.laa.2015.04.015
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
In this work, we develop an optimization framework for problems whose solutions are well-approximated by Hierarchical Tucker (HT) tensors, an efficient structured tensor format based on recursive subspace factorizations. By exploiting the smooth manifold structure of these tensors, we construct standard optimization algorithms, such as Steepest Descent and Conjugate Gradient, for completing tensors from missing entries. Our algorithmic framework is fast and scalable to large problem sizes because, unlike other methods, it does not require SVDs on the ambient tensor space. Moreover, we exploit the structure of the Gramian matrices associated with the HT format to regularize our problem, reducing overfitting at high subsampling ratios. We also find that the organization of the tensor can have a major impact on completion from realistic seismic acquisition geometries. These samplings are far from the idealized random samplings usually considered in the literature, yet are realizable in practical scenarios. Using these algorithms, we successfully interpolate large-scale seismic data sets and demonstrate the competitive computational scaling of our algorithms as the problem size grows. © 2015 Elsevier Inc. All rights reserved.
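The abstract describes completing a tensor from missing entries by optimizing over a low-rank factorization directly, avoiding SVDs on the ambient space. The following is a minimal illustrative sketch of that idea in the simplest setting (a rank-r matrix, plain gradient descent on the factors of X = L @ R); it is not the paper's HT-manifold or Gauss-Newton method, and all names and parameters here are invented for illustration.

```python
import numpy as np

# Sketch: recover a low-rank array from a subset of its entries by
# gradient descent on the factors of X = L @ R. Only the factors are
# updated, so no SVD of the full (ambient) array is ever needed.
rng = np.random.default_rng(0)
n, r = 30, 3
X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.5          # which entries are observed
B = mask * X_true                        # observed data, zeros elsewhere

L = 0.1 * rng.standard_normal((n, r))    # small random initialization
R = 0.1 * rng.standard_normal((r, n))
step = 0.01
for _ in range(3000):
    resid = mask * (L @ R - B)           # misfit on observed entries only
    L -= step * resid @ R.T              # gradient steps on both factors
    R -= step * L.T @ resid

rel_err = np.linalg.norm(L @ R - X_true) / np.linalg.norm(X_true)
```

The HT format in the paper generalizes this factor-based parametrization to high-order tensors via a recursive tree of subspace factorizations, and replaces plain gradient steps with Riemannian optimization on the manifold of fixed-rank HT tensors.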
Pages: 131-173
Page count: 43