Implicit Regularization in Deep Tensor Factorization

Cited by: 3
Authors
Milanesi, Paolo [1]
Kadri, Hachem [1]
Ayache, Stephan [1]
Artieres, Thierry [1,2]
Affiliations
[1] Aix Marseille Univ, Univ Toulon, CNRS, LIS, Marseille, France
[2] Ecole Cent Marseille, Marseille, France
Source
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
Keywords
tensor factorization; deep learning; Tucker decomposition; tensor-train; effective rank; approximation
DOI
10.1109/IJCNN52387.2021.9533690
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Attempts at studying the implicit regularization associated with gradient descent (GD) have identified matrix completion as a suitable test-bed. Recent findings suggest that this phenomenon cannot be phrased as a norm-minimization problem, implying that a paradigm shift is required and that the dynamics of optimization have to be taken into account. In the present work we address the more general setting of tensor completion by leveraging two popular tensor factorizations, namely Tucker and Tensor-Train (TT). We track relevant quantities such as the tensor nuclear norm, effective rank, and generalized singular values, and we introduce deep Tucker and TT unconstrained factorizations to deal with the completion task. Experiments on both synthetic and real data show that gradient descent promotes low-rank solutions, and support the conjecture that the phenomenon has to be addressed from a dynamical perspective.
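The abstract describes training deep, unconstrained Tucker and Tensor-Train factorizations by gradient descent on observed tensor entries while tracking quantities such as the effective rank. The sketch below is an illustrative toy setup only, not the authors' implementation: it builds a small synthetic Tucker-style tensor completion problem in PyTorch, over-parameterizes each mode factor as a product of two matrices as a stand-in for a "deep" factorization, and monitors the entropy-based effective rank of a mode unfolding. The tensor size, depth, initialization scale, and learning rate are all assumptions made for the example.

```python
# Minimal, illustrative sketch -- NOT the authors' implementation.
# Assumptions: a small synthetic 3-way tensor, a Tucker-style unconstrained
# factorization whose mode factors are each a product of two matrices
# (a stand-in for the paper's "deep" factorization), plain SGD, and the
# entropy-based effective rank of a mode unfolding as the tracked quantity.
import torch

torch.manual_seed(0)
d, r_true = 20, 2                                  # tensor side length, ground-truth Tucker rank
dims = (d, d, d)

# Ground-truth low-Tucker-rank tensor and a random observation mask.
G_true = torch.randn(r_true, r_true, r_true)
U_true = [torch.randn(d, r_true) for _ in range(3)]
target = torch.einsum('abc,ia,jb,kc->ijk', G_true, *U_true)
mask = (torch.rand(dims) < 0.3).float()            # roughly 30% of entries observed

def effective_rank(mat: torch.Tensor) -> float:
    """Exponential of the entropy of the normalized singular values."""
    s = torch.linalg.svdvals(mat)
    p = s / s.sum()
    return torch.exp(-(p * torch.log(p + 1e-12)).sum()).item()

# Over-parameterized Tucker variables: a full-size core and, per mode,
# a product of two full-size matrices, all initialized near zero.
core = (1e-2 * torch.randn(*dims)).requires_grad_()
factors = [[(1e-2 * torch.randn(d, d)).requires_grad_() for _ in range(2)]
           for _ in range(3)]
params = [core] + [m for pair in factors for m in pair]
opt = torch.optim.SGD(params, lr=0.5)              # illustrative learning rate

for step in range(3001):
    A, B, C = (pair[0] @ pair[1] for pair in factors)       # "deep" mode factors
    pred = torch.einsum('abc,ia,jb,kc->ijk', core, A, B, C)
    loss = ((mask * (pred - target)) ** 2).sum() / mask.sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        mode1 = pred.detach().reshape(d, -1)                 # mode-1 unfolding
        print(f"step {step:5d}  train loss {loss.item():.5f}  "
              f"effective rank (mode 1) {effective_rank(mode1):.2f}")
```

Under this kind of small, near-zero initialization one would expect the printed effective rank to remain well below the full dimension d even as the training loss on observed entries decreases, which is the low-rank bias of gradient descent that the abstract describes.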
Pages: 8
Related Papers
50 records in total
  • [1] Implicit Regularization with Polynomial Growth in Deep Tensor Factorization
    Hariz, Kais
    Kadri, Hachem
    Ayache, Stephane
    Moakher, Maher
    Artieres, Thierry
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [2] Implicit Regularization in Tensor Factorization
    Razin, Noam
    Maman, Asaf
    Cohen, Nadav
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [3] Implicit Regularization in Hierarchical Tensor Factorization and Deep Convolutional Neural Networks
    Razin, Noam
    Maman, Asaf
    Cohen, Nadav
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [4] Implicit Regularization in Deep Matrix Factorization
    Arora, Sanjeev
    Cohen, Nadav
    Hu, Wei
    Luo, Yuping
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019
  • [5] Implicit Regularization in Matrix Factorization
    Gunasekar, Suriya
    Woodworth, Blake
    Bhojanapalli, Srinadh
    Neyshabur, Behnam
    Srebro, Nathan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017
  • [6] Implicit Regularization in Matrix Factorization
    Gunasekar, Suriya
    Woodworth, Blake
    Bhojanapalli, Srinadh
    Neyshabur, Behnam
    Srebro, Nathan
    2018 INFORMATION THEORY AND APPLICATIONS WORKSHOP (ITA), 2018
  • [7] Implicit Regularization in Deep Tucker Factorization: Low-Rankness via Structured Sparsity
    Hariz, Kais
    Kadri, Hachem
    Ayache, Stephane
    Moakher, Maher
    Artieres, Thierry
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
  • [8] Streaming Bayesian Deep Tensor Factorization
    Fang, Shikai
    Wang, Zheng
    Pan, Zhimeng
    Liu, Ji
    Zhe, Shandian
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [9] DEEP TENSOR FACTORIZATION FOR HYPERSPECTRAL IMAGE CLASSIFICATION
    Chen, Jingzhou
    Zhang, Wei
    Qian, Yuntao
    Ye, Minchao
    IGARSS 2018 - 2018 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2018: 4788-4791
  • [10] STELAR: Spatio-temporal Tensor Factorization with Latent Epidemiological Regularization
    Kargas, Nikos
    Qian, Cheng
    Sidiropoulos, Nicholas D.
    Xiao, Cao
    Glass, Lucas M.
    Sun, Jimeng
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35: 4830-4837