49 items total
- [42] A Smoothing Algorithm with Constant Learning Rate for Training Two Kinds of Fuzzy Neural Networks and Its Convergence. Neural Processing Letters, 2020, 51: 1093-1109
- [43] Recurrent Neural Network with L1/2 Regularization for Regression and Multiclass Classification Problems. Journal of Nonlinear Functional Analysis, 2022, 2022
- [47] Convergence of batch gradient algorithm with smoothing composition of group l0 and l1/2 regularization for feedforward neural networks. Progress in Artificial Intelligence, 2022, 11(3): 269-278
- [48] Batch Gradient Training Method with Smoothing Group L0 Regularization for Feedforward Neural Networks. Neural Processing Letters, 2023, 55(2): 1663-1679
- [49] Batch gradient training method with smoothing ℓ0 regularization for feedforward neural networks. Neural Computing and Applications, 2015, 26(2): 383-390