Implicit Regularization in Hierarchical Tensor Factorization and Deep Convolutional Neural Networks

Cited: 0
Authors:
Razin, Noam [1 ]
Maman, Asaf [1 ]
Cohen, Nadav [1 ]
Affiliations:
[1] Tel Aviv Univ, Blavatnik Sch Comp Sci, Tel Aviv, Israel
Source:
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
Funding:
Israel Science Foundation
Keywords:
DECOMPOSITION;
DOI:
Not available
Chinese Library Classification:
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes:
081104; 0812; 0835; 1405
Abstract
In the pursuit of explaining implicit regularization in deep learning, prominent focus has been given to matrix and tensor factorizations, which correspond to simplified neural networks. These models were shown to exhibit an implicit tendency towards low matrix and tensor rank, respectively. Drawing closer to practical deep learning, the current paper theoretically analyzes the implicit regularization in hierarchical tensor factorization, a model equivalent to certain deep convolutional neural networks. Through a dynamical systems lens, we overcome challenges associated with hierarchy and establish implicit regularization towards low hierarchical tensor rank. This translates to an implicit regularization towards locality for the associated convolutional networks. Inspired by our theory, we design explicit regularization discouraging locality, and demonstrate its ability to improve the performance of modern convolutional networks on non-local tasks, in defiance of the conventional wisdom that architectural changes are needed. Our work highlights the potential of enhancing neural networks via theoretical analysis of their implicit regularization.
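The low-rank implicit bias the abstract builds on can be illustrated in the simplest, non-hierarchical setting it mentions: matrix factorization. Gradient descent from a small initialization on an underdetermined matrix-completion loss tends toward a low-rank solution, even though no rank penalty appears in the objective. A minimal NumPy sketch (not from the paper; the matrix, mask, and hyperparameters below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-1 ground-truth matrix; only off-diagonal entries are observed,
# so the completion problem is underdetermined.
u = np.array([1.0, 0.5, -0.5, 1.0])
A = np.outer(u, u)
mask = 1.0 - np.eye(4)          # hide the diagonal

# Depth-2 factorization W = W2 @ W1 with near-zero initialization --
# the regime in which the implicit low-rank bias is observed.
W1 = 1e-2 * rng.standard_normal((4, 4))
W2 = 1e-2 * rng.standard_normal((4, 4))

lr = 0.1
for _ in range(5000):
    R = mask * (W2 @ W1 - A)    # residual on observed entries only
    gW2 = R @ W1.T              # gradient of 0.5 * ||R||^2 w.r.t. W2
    gW1 = W2.T @ R              # gradient of 0.5 * ||R||^2 w.r.t. W1
    W2 -= lr * gW2
    W1 -= lr * gW1

W = W2 @ W1
s = np.linalg.svd(W, compute_uv=False)
print("observed-entry loss:", np.sum((mask * (W - A)) ** 2))
print("singular values:", np.round(s, 4))
```

The fitted product matches the observed entries while its spectrum is dominated by a single singular value, i.e., the unobserved entries are filled in with an (approximately) rank-1 matrix. The paper's contribution is establishing the analogous bias, towards low *hierarchical* tensor rank, for the deep hierarchical model.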
Pages: 41