Convergence analysis of batch gradient algorithm for three classes of sigma-pi neural networks

Cited by: 17
Authors
Zhang, Chao [1 ]
Wu, Wei [1 ]
Xiong, Yan [1 ]
Affiliation
[1] Dalian Univ Technol, Dept Appl Math, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
convergence; Sigma-Pi-Sigma neural networks; Sigma-Sigma-Pi neural networks; Sigma-Pi-Sigma-Pi neural networks; batch gradient algorithm; monotonicity
DOI
10.1007/s11063-007-9050-0
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sigma-Pi (Σ-Π) neural networks (SPNNs) are known to provide more powerful mapping capability than traditional feed-forward neural networks. A unified convergence analysis of the batch gradient algorithm for SPNN learning is presented, covering three classes of SPNNs: Sigma-Pi-Sigma, Sigma-Sigma-Pi, and Sigma-Pi-Sigma-Pi. The monotonicity of the error function during the iteration is also guaranteed.
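The batch (offline) gradient method analyzed in the abstract accumulates the error gradient over the entire training set before each weight update. A minimal sketch for a single Sigma-Pi unit is given below; the product-term index sets, the toy XOR-style data, and the learning rate are illustrative assumptions, not the paper's formulation or experiments.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): a single Sigma-Pi unit
#   y = sigmoid( sum_k w_k * prod_{i in S_k} x_i )
# trained by batch gradient descent on squared error.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Each monomial term multiplies a fixed subset of the inputs (assumed sets).
TERMS = [(0,), (1,), (0, 1)]  # S_1 = {x0}, S_2 = {x1}, S_3 = {x0, x1}

def forward(w, X):
    # Phi[n, k] = product of the inputs in S_k for sample n (the "Pi" part)
    Phi = np.stack([np.prod(X[:, list(S)], axis=1) for S in TERMS], axis=1)
    return Phi, sigmoid(Phi @ w)  # weighted sum ("Sigma" part) + activation

def batch_gradient_step(w, X, t, eta):
    # One batch update: the gradient is summed over ALL samples first.
    Phi, y = forward(w, X)
    delta = (y - t) * y * (1.0 - y)  # dE/d(net) for squared error + sigmoid
    grad = Phi.T @ delta             # accumulated over the whole batch
    return w - eta * grad

# Toy data: the cross term x0*x1 lets a Sigma-Pi unit represent XOR.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])

w = np.zeros(len(TERMS))
errors = []
for _ in range(200):
    _, y = forward(w, X)
    errors.append(0.5 * np.sum((y - t) ** 2))
    w = batch_gradient_step(w, X, t, eta=0.2)
```

With a sufficiently small learning rate, the batch error in such a run decreases over the iterations, which is the kind of monotonicity property the paper establishes rigorously for its three network classes.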
Pages: 177-189
Number of pages: 13