The Weights Reset Technique for Deep Neural Networks Implicit Regularization

Citations: 0
Authors
Plusch, Grigoriy [1 ]
Arsenyev-Obraztsov, Sergey [1 ]
Kochueva, Olga [1 ]
Affiliations
[1] Natl Univ Oil & Gas Gubkin Univ, Dept Appl Math & Comp Modeling, 65 Leninsky Prospekt, Moscow 119991, Russia
Keywords
machine learning; deep learning; implicit regularization; computer vision; representations
DOI
10.3390/computation11080148
Chinese Library Classification
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
We present a new regularization method called Weights Reset, which periodically resets a random portion of layer weights during training by re-sampling them from predefined probability distributions. The technique was applied and tested on several popular classification datasets: Caltech-101, CIFAR-100, and Imagenette. We compare the results with those of traditional regularization methods. The tests demonstrate that the Weights Reset method is competitive, achieving the best performance on the Imagenette dataset and on the challenging, unbalanced Caltech-101 dataset. The method also shows potential to prevent vanishing and exploding gradients. However, this analysis is preliminary, and further comprehensive studies are needed to understand the capabilities and limitations of the Weights Reset method. The observed results suggest that Weights Reset is an effective extension of traditional regularization methods and can help improve model performance and generalization.
Pages: 16
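The abstract describes resetting a random portion of layer weights at fixed intervals during training, with replacement values drawn from a predefined distribution. The following is a minimal sketch of that idea, assuming PyTorch; the per-weight Bernoulli reset probability, the reset period, and the normal re-initialization distribution are illustrative assumptions, not the authors' exact settings.

```python
import torch
import torch.nn as nn

def weights_reset(model: nn.Module, reset_prob: float = 0.05,
                  init_std: float = 0.02) -> None:
    """Re-sample a random subset of weights from N(0, init_std^2).

    Each weight is independently selected for reset with probability
    ``reset_prob`` (an assumed Bernoulli scheme; the paper speaks only of
    "predefined probability distributions").
    """
    with torch.no_grad():
        for module in model.modules():
            if isinstance(module, (nn.Linear, nn.Conv2d)):
                w = module.weight
                mask = torch.rand_like(w) < reset_prob   # weights to reset
                fresh = torch.randn_like(w) * init_std   # new random values
                w.copy_(torch.where(mask, fresh, w))

# Usage in a toy training loop: reset every `period` optimization steps.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
period = 100
for step in range(1, 1001):
    x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
    loss = nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if step % period == 0:
        weights_reset(model)  # periodic partial re-initialization
```

Because only a random fraction of weights is re-initialized at each reset, the network retains most of what it has learned while the injected noise acts as an implicit regularizer, which is the behavior the abstract attributes to the method.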