Generalization of stochastic-resonance-based threshold networks with Tikhonov regularization

Cited: 15
Authors
Bai, Saiya [1 ]
Duan, Fabing [1 ]
Chapeau-Blondeau, Francois [2 ]
Abbott, Derek [3 ,4 ]
Affiliations
[1] Qingdao Univ, Inst Complex Sci, Coll Automat, Qingdao 266071, Peoples R China
[2] Univ Angers, Lab Angevin Rech Ingn Syst, 62 Ave Notre Dame Lac, F-49000 Angers, France
[3] Univ Adelaide, Ctr Biomed Engn, Adelaide, SA 5005, Australia
[4] Univ Adelaide, Sch Elect & Elect Engn, Adelaide, SA 5005, Australia
Funding
Australian Research Council;
Keywords
FEEDFORWARD NEURAL-NETWORKS; NOISE INJECTION;
DOI
10.1103/PhysRevE.106.L012101
Chinese Library Classification (CLC)
O35 [Fluid Mechanics]; O53 [Plasma Physics];
Discipline Classification Codes
070204; 080103; 080704;
Abstract
Injecting artificial noise into a feedforward threshold neural network allows it to become trainable by gradient-based methods and also enlarges the parameter space as well as the range of synaptic weights. This configuration constitutes a stochastic-resonance-based threshold neural network, where the noise level can adaptively converge to a nonzero optimal value for finding a local minimum of the loss criterion. We prove theoretically that the injected noise plays the role of a generalized Tikhonov regularizer for training the designed threshold network. Experiments on regression and classification problems demonstrate that the generalization of the stochastic-resonance-based threshold network is improved by the injection of noise. The feasibility of injecting noise into the threshold neural network opens up the potential for adaptive stochastic resonance in machine learning.
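The mechanism described in the abstract can be illustrated with a minimal NumPy sketch (not taken from the paper; the Gaussian noise choice, function names, and ensemble size are illustrative assumptions). A hard-threshold activation has zero gradient almost everywhere, so backpropagation cannot train it. Averaging many copies of the unit with independent injected noise gives a smooth, sigmoid-like mean response, which restores a usable gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_threshold(x):
    # Heaviside activation: non-differentiable, gradient is zero a.e.
    return (x > 0).astype(float)

def noisy_threshold_mean(x, sigma, n_copies=10000):
    # Ensemble average of n_copies threshold units with injected
    # Gaussian noise. For Gaussian eta, E[step(x + eta)] = Phi(x / sigma),
    # a smooth sigmoid-like function of x with a nonzero derivative,
    # so gradient-based training becomes possible.
    eta = rng.normal(0.0, sigma, size=(n_copies,) + np.shape(x))
    return hard_threshold(x + eta).mean(axis=0)

x = np.linspace(-3.0, 3.0, 7)
smooth = noisy_threshold_mean(x, sigma=1.0)  # close to the Gaussian CDF at x
```

At zero noise (sigma -> 0) the mean response collapses back to the hard step, which is why, per the abstract, the adaptive noise level converges to a nonzero optimum during training.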
Pages: 5