Nonlinear system identification using neural networks trained with natural gradient descent

Cited by: 0
Author: [1] Ibnkahla, Mohamed
Source: Ibnkahla, M. (mohamed.ibnkahla@ece.queensu.ca) | Hindawi Publishing Corporation, 2003
DOI: Not available
Related papers (50 in total)
  • [41] Calibrated Stochastic Gradient Descent for Convolutional Neural Networks
    Zhuo, Li'an
    Zhang, Baochang
    Chen, Chen
    Ye, Qixiang
    Liu, Jianzhuang
    Doermann, David
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 9348 - 9355
  • [42] Gradient descent learning for quaternionic Hopfield neural networks
    Kobayashi, Masaki
    NEUROCOMPUTING, 2017, 260 : 174 - 179
  • [43] A gradient descent learning algorithm for fuzzy neural networks
    Feuring, T
    Buckley, JJ
    Hayashi, Y
    1998 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS AT THE IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE - PROCEEDINGS, VOL 1-2, 1998, : 1136 - 1141
  • [44] Convergence of gradient descent for learning linear neural networks
    Nguegnang, Gabin Maxime
    Rauhut, Holger
    Terstiege, Ulrich
    ADVANCES IN CONTINUOUS AND DISCRETE MODELS, 2024, 2024 (01):
  • [45] Generalization Guarantees of Gradient Descent for Shallow Neural Networks
    Wang, Puyu
    Lei, Yunwen
    Wang, Di
    Ying, Yiming
    Zhou, Ding-Xuan
    NEURAL COMPUTATION, 2025, 37 (02) : 344 - 402
  • [46] Fractional Gradient Descent Method for Spiking Neural Networks
    Yang, Honggang
    Chen, Jiejie
    Jiang, Ping
    Xu, Mengfei
    Zhao, Haiming
    2023 2ND CONFERENCE ON FULLY ACTUATED SYSTEM THEORY AND APPLICATIONS, CFASTA, 2023, : 636 - 641
  • [47] Understanding the Convolutional Neural Networks with Gradient Descent and Backpropagation
    Zhou, XueFei
    2ND INTERNATIONAL CONFERENCE ON MACHINE VISION AND INFORMATION TECHNOLOGY (CMVIT 2018), 2018, 1004
  • [48] Neural Networks can Learn Representations with Gradient Descent
    Damian, Alex
    Lee, Jason D.
    Soltanolkotabi, Mahdi
    CONFERENCE ON LEARNING THEORY, VOL 178, 2022, 178
  • [49] Understanding approximate Fisher information for fast convergence of natural gradient descent in wide neural networks*
    Karakida, Ryo
    Osawa, Kazuki
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2021, 2021 (12):
  • [50] Understanding Approximate Fisher Information for Fast Convergence of Natural Gradient Descent in Wide Neural Networks
    Karakida, Ryo
    Osawa, Kazuki
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33