BLOCK-RANDOMIZED STOCHASTIC PROXIMAL GRADIENT FOR CONSTRAINED LOW-RANK TENSOR FACTORIZATION

Cited by: 0
Authors
Fu, Xiao [1 ]
Gao, Cheng [1 ]
Wai, Hoi-To [2 ]
Huang, Kejun [3 ]
Affiliations
[1] Oregon State Univ, Sch EECS, Corvallis, OR 97331 USA
[2] Chinese Univ Hong Kong, Dept SEEM, Hong Kong, Peoples R China
[3] Univ Florida, Dept CISE, Gainesville, FL USA
Source
2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) | 2019
Funding
U.S. National Science Foundation;
Keywords
Canonical polyadic decomposition; PARAFAC; stochastic optimization;
DOI
Not available
Chinese Library Classification
O42 [Acoustics];
Subject Classification Codes
070206; 082403;
Abstract
This work focuses on canonical polyadic decomposition (CPD) for large-scale tensors. Many prior works rely on data sparsity to develop scalable CPD algorithms, which are not suitable for handling dense tensors, even though dense tensors often arise in applications such as image and video processing. As an alternative, stochastic algorithms use data sampling to reduce per-iteration complexity and are therefore very scalable, even when handling dense tensors. However, existing stochastic CPD algorithms face several challenges. For example, some algorithms are based on randomly sampled tensor entries, so each iteration can only update a small portion of the latent factors, which may result in slow improvement of the estimation accuracy of the latent factors. In addition, the convergence properties of many stochastic CPD algorithms are unclear, perhaps because CPD poses a hard nonconvex problem that is challenging to analyze in stochastic settings. In this work, we propose a stochastic optimization strategy that effectively circumvents the above challenges. The proposed algorithm updates an entire latent factor at each iteration using sampled fibers of the tensor, which quickly improves estimation accuracy. The algorithm is flexible: many commonly used regularizers and constraints can be easily incorporated into the computational framework. The algorithm is also backed by a rigorous convergence theory. Simulations on large-scale dense tensors are employed to showcase the effectiveness of the algorithm.
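The update scheme described in the abstract (pick one mode at random, sample a batch of fibers along that mode, take a stochastic gradient step on the corresponding factor matrix, then apply a proximal step) can be sketched in a few lines of NumPy. The sketch below is illustrative only and is not the authors' implementation: it assumes a dense 3-way tensor, a least-squares fitting loss, nonnegativity constraints handled by projection as the proximal operator, and a simple diminishing step size; all names (bras_cpd_sketch, batch, step0) are hypothetical.

import numpy as np

def bras_cpd_sketch(X, R, batch=64, step0=0.5, iters=5000, seed=0):
    # Illustrative block-randomized stochastic proximal gradient for a
    # rank-R CPD of a dense 3-way array X. Assumptions (not from the paper's
    # code): squared loss, nonnegative factors enforced by projection,
    # step size step0 / sqrt(t + 1).
    rng = np.random.default_rng(seed)
    dims = X.shape                                   # (I, J, K)
    A = [rng.random((d, R)) for d in dims]           # factor matrices A_1, A_2, A_3

    for t in range(iters):
        n = int(rng.integers(3))                     # randomly pick one mode (block)
        others = [m for m in range(3) if m != n]     # the two remaining modes

        # Sample a batch of mode-n fibers by drawing indices of the other modes.
        idx = [rng.integers(dims[m], size=batch) for m in others]

        # Rows of the Khatri-Rao product of the other factors, restricted to
        # the sampled fibers: H[b, :] = A_o1[i_b, :] * A_o2[j_b, :].
        H = A[others[0]][idx[0], :] * A[others[1]][idx[1], :]    # (batch, R)

        # Gather the sampled fibers as a (dims[n], batch) matrix.
        Xn = np.moveaxis(X, n, 0)[:, idx[0], idx[1]]

        # Stochastic gradient of 1/(2*batch) * ||Xn - A_n H^T||_F^2 w.r.t. A_n.
        G = (A[n] @ (H.T @ H) - Xn @ H) / batch

        # Proximal step: here, projection onto the nonnegative orthant.
        alpha = step0 / np.sqrt(t + 1.0)             # diminishing step size
        A[n] = np.maximum(A[n] - alpha * G, 0.0)

    return A

if __name__ == "__main__":
    # Sanity check on a small synthetic nonnegative rank-5 tensor.
    rng = np.random.default_rng(1)
    true = [rng.random((40, 5)) for _ in range(3)]
    X = np.einsum("ir,jr,kr->ijk", *true)
    A = bras_cpd_sketch(X, R=5)
    Xhat = np.einsum("ir,jr,kr->ijk", *A)
    print("relative fit error:", np.linalg.norm(X - Xhat) / np.linalg.norm(X))

The batch size and step-size schedule above are placeholder choices; in practice they would be selected in line with the paper's convergence analysis, and the projection step would be replaced by the proximal operator of whatever regularizer or constraint is imposed on the factors.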
Pages: 7485-7489
Number of pages: 5