Deterministic convergence analysis via smoothing group Lasso regularization and adaptive momentum for Sigma-Pi-Sigma neural network

Cited by: 30
Authors
Kang, Qian [1 ]
Fan, Qinwei [1 ]
Zurada, Jacek M. [2 ,3 ]
Affiliations
[1] Xian Polytech Univ, Sch Sci, Xian 710048, Peoples R China
[2] Univ Louisville, Dept Elect & Comp Engn, Louisville, KY 40292 USA
[3] Spoleczna Akad Nauk, PL-90011 Lodz, Poland
Funding
China Postdoctoral Science Foundation
Keywords
Sigma-Pi-Sigma neural network; Smoothing group lasso regularization; Adaptive momentum; Convergence; L-1/2 REGULARIZATION; LEARNING ALGORITHM; GRADIENT-METHOD; PENALTY; ONLINE
DOI
10.1016/j.ins.2020.12.014
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this paper, we propose a sparse and accelerated training method for Sigma-Pi-Sigma neural networks based on smoothing group lasso regularization and adaptive momentum. It is shown that group sparsity can sparsify the network structure more efficiently at the group level, and that the adaptive momentum term speeds up convergence during the iteration process. Another important contribution lies in the theoretical analysis: the group lasso regularizer is not differentiable at the origin, which leads to oscillations observed in numerical experiments and poses a challenge to the analysis. We overcome these problems with a smoothing technique. Under suitable assumptions, we rigorously prove monotonicity as well as weak and strong convergence theorems for the new algorithm. Finally, numerical experiments are presented to support our theoretical findings. (C) 2020 Published by Elsevier Inc.
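To make the two ingredients of the abstract concrete, the Python/NumPy sketch below illustrates one common instantiation: the non-differentiable group-lasso term sum_g ||w_g|| is replaced by the smooth surrogate sum_g sqrt(||w_g||^2 + mu^2), and the gradient step carries a momentum coefficient that adapts to the current gradient norm. The smoothing function, the adaptive rule, the group layout, and all names here are illustrative assumptions for exposition, not the authors' exact formulation.

import numpy as np

def smoothed_group_penalty(w, groups, mu=1e-3):
    # Differentiable surrogate for sum_g ||w_g||: sum_g sqrt(||w_g||^2 + mu^2),
    # which is well-defined and smooth even when an entire group w_g is zero.
    return sum(np.sqrt(w[g] @ w[g] + mu**2) for g in groups)

def smoothed_group_penalty_grad(w, groups, mu=1e-3):
    grad = np.zeros_like(w)
    for g in groups:
        grad[g] = w[g] / np.sqrt(w[g] @ w[g] + mu**2)  # no division by zero at w_g = 0
    return grad

def train(loss_grad, w0, groups, lam=1e-3, eta=0.1, alpha=0.5, mu=1e-3, iters=1000):
    # Batch gradient descent on E(w) = loss(w) + lam * smoothed penalty,
    # with an adaptive momentum term.
    w = w0.copy()
    dw_prev = np.zeros_like(w)
    for _ in range(iters):
        g = loss_grad(w) + lam * smoothed_group_penalty_grad(w, groups, mu)
        norm_prev = np.linalg.norm(dw_prev)
        # Adaptive momentum (an assumption matching rules used in related
        # convergence analyses): the coefficient shrinks with the gradient
        # norm, so the momentum term vanishes near a stationary point.
        alpha_k = alpha * min(1.0, np.linalg.norm(g) / norm_prev) if norm_prev > 0 else 0.0
        dw = -eta * g + alpha_k * dw_prev
        w = w + dw
        dw_prev = dw
    return w

# Example: quadratic loss with two weight groups; the second group is
# irrelevant to the data and is driven toward zero as a whole group.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 6))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0])
    y = X @ w_true + 0.01 * rng.normal(size=100)
    groups = [np.arange(0, 3), np.arange(3, 6)]
    loss_grad = lambda w: X.T @ (X @ w - y) / len(y)
    w_hat = train(loss_grad, np.zeros(6), groups, lam=0.05)
    print(np.round(w_hat, 3))  # weights in the second group shrink together

The smoothing is what enables the convergence theorems: with the surrogate penalty, the regularized error function is continuously differentiable everywhere, so the gradient-based monotonicity and convergence arguments go through without the oscillations caused by the kink at the origin.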
Pages: 66-82
Page count: 17
Related Papers
16 items in total
  • [1] Convergence analysis for sigma-pi-sigma neural network based on some relaxed conditions
    Fan, Qinwei
    Kang, Qian
    Zurada, Jacek M.
    INFORMATION SCIENCES, 2022, 585 : 70 - 88
  • [2] Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma-Pi-Sigma neural networks
    Liu, Yan
    Li, Zhengxue
    Yang, Dakun
    Mohamed, Kh. Sh.
    Wang, Jing
    Wu, Wei
    NEUROCOMPUTING, 2015, 151 : 333 - 341
  • [3] A Sigma-Pi-Sigma Neural Network (SPSNN)
    Li, Chien-Kuo
    NEURAL PROCESSING LETTERS, 2003, 17 (01) : 1 - 19
  • [4] A new Sigma-Pi-Sigma neural network based on L1 and L2 regularization and applications
    Jiao, Jianwei
    Su, Keqin
    AIMS MATHEMATICS, 2024, 9 (03) : 5995 - 6012
  • [5] Convergence of batch gradient learning with smoothing regularization and adaptive momentum for neural networks
    Fan, Qinwei
    Wu, Wei
    Zurada, Jacek M.
    SPRINGERPLUS, 2016, 5
  • [6] Relaxed conditions for convergence analysis of online back-propagation algorithm with L2 regularizer for Sigma-Pi-Sigma neural network
    Liu, Yan
    Yang, Dakun
    Zhang, Chao
    NEUROCOMPUTING, 2018, 272 : 163 - 169
  • [7] Convergence of Batch Gradient Method for Training of Pi-Sigma Neural Network with Regularizer and Adaptive Momentum Term
    Fan, Qinwei
    Liu, Le
    Kang, Qian
    Zhou, Li
    NEURAL PROCESSING LETTERS, 2023, 55 (04) : 4871 - 4888
  • [8] A pruning algorithm with relaxed conditions for high-order neural networks based on smoothing group L1/2 regularization and adaptive momentum
    Kang, Qian
    Fan, Qinwei
    Zurada, Jacek M.
    Huang, Tingwen
    KNOWLEDGE-BASED SYSTEMS, 2022, 257