A modified gradient learning algorithm with smoothing L1/2 regularization for Takagi-Sugeno fuzzy models

Cited by: 18
Authors
Liu, Yan [1 ]
Wu, Wei [2 ]
Fan, Qinwei [2 ]
Yang, Dakun [2 ]
Wang, Jian [3 ]
Affiliations
[1] Dalian Polytech Univ, Sch Informat Sci & Engn, Dalian 116034, Peoples R China
[2] Dalian Univ Technol, Sch Math Sci, Dalian 116024, Peoples R China
[3] China Univ Petr, Sch Math & Computat Sci, Dongying 257061, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Takagi-Sugeno (T-S) fuzzy models; gradient descent method; convergence; Gaussian-type membership function; variable selection; regularizer; identification; networks; systems
DOI
10.1016/j.neucom.2014.01.041
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
A popular and feasible approach to determining the appropriate size of a neural network is to remove unnecessary connections from an oversized network. The advantage of L1/2 regularization for sparse modeling has been well recognized; however, the nonsmoothness of the L1/2 regularizer may cause oscillations during learning. This paper proposes an approach with smoothing L1/2 regularization for Takagi-Sugeno (T-S) fuzzy models, which improves learning efficiency and promotes sparsity of the models. The new smoothing L1/2 regularizer eliminates the oscillations and, in addition, enables us to prove weak and strong convergence results for zero-order T-S fuzzy neural networks. Furthermore, a relationship between the learning rate and the penalty parameter is given that guarantees convergence. Simulation results support the theoretical findings and demonstrate the superiority of the smoothing L1/2 regularization over the original L1/2 regularization. (C) 2014 Elsevier B.V. All rights reserved.
Pages: 229-237 (9 pages)
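To make the approach described in the abstract concrete, here is a minimal Python sketch of batch gradient descent for a zero-order T-S model with a smoothing L1/2 penalty on the consequent weights. The piecewise-polynomial smoothing function, the toy data, and all parameter values (smoothing radius a, learning rate eta, penalty weight lam) are illustrative assumptions drawn from the smoothing-L1/2 literature, not necessarily the paper's exact formulation; the Gaussian antecedent parameters are held fixed for brevity, whereas the paper trains the full model.

```python
# Illustrative sketch (not the paper's reference implementation):
# batch gradient descent for a zero-order Takagi-Sugeno model with a
# smoothing L1/2 penalty on the consequent weights.
import numpy as np

A = 0.1  # smoothing radius (assumed value)

def smooth_abs(w, a=A):
    """Smooth surrogate for |w|: equals |w| for |w| >= a and a quartic
    polynomial on (-a, a); h(0) = 3a/8 > 0, so h(w)**0.5 is smooth at 0."""
    return np.where(np.abs(w) >= a, np.abs(w),
                    -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8)

def smooth_abs_grad(w, a=A):
    """Derivative of smooth_abs (continuous at w = +/- a)."""
    return np.where(np.abs(w) >= a, np.sign(w),
                    -w**3 / (2 * a**3) + 3 * w / (2 * a))

def penalty_grad(w, lam):
    """Gradient of the smoothed L1/2 term  lam * sum_j h(w_j)**(1/2)."""
    h = smooth_abs(w)
    return lam * 0.5 * h**(-0.5) * smooth_abs_grad(w)

def firing(X, centers, sigmas):
    """Normalized Gaussian rule firing strengths, shape (N, J)."""
    d2 = (((X[:, None, :] - centers[None, :, :]) ** 2)
          / (2.0 * sigmas[None, :, :] ** 2)).sum(axis=-1)
    mu = np.exp(-d2)
    return mu / mu.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)

# Toy 1-D regression task with a deliberately oversized rule base,
# so the penalty has redundant rules to prune.
X = np.linspace(-2.0, 2.0, 80)[:, None]
y = np.sin(np.pi * X[:, 0] / 2.0)
J = 10
centers = np.linspace(-2.0, 2.0, J)[:, None]  # antecedents fixed for brevity
sigmas = np.full((J, 1), 0.5)
w = rng.normal(scale=0.1, size=J)             # consequent weights (trained)

eta, lam = 0.05, 0.01                         # assumed learning rate / penalty weight
Phi = firing(X, centers, sigmas)
for _ in range(5000):
    err = Phi @ w - y                         # batch residuals
    grad = Phi.T @ err / len(X) + penalty_grad(w, lam)
    w -= eta * grad                           # one batch gradient step

print("train MSE :", float(np.mean((Phi @ w - y) ** 2)))
print("|w| < 0.01:", int(np.sum(np.abs(w) < 1e-2)), "of", J)  # sparsity effect
```

Because the smoothed surrogate h(w) is bounded away from zero at w = 0, the penalty gradient stays finite everywhere, which is what removes the oscillations that the raw |w|^(1/2) term produces near the origin and what makes the paper's convergence analysis possible.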