Hierarchically structured neural networks: a way to shape a 'magma' of neurons

Cited by: 3
Authors
Bittanti, S [1 ]
Savaresi, SM [1 ]
Affiliation
[1] Politecn Milan, Dipartimento Elettr & Informat, I-20133 Milan, Italy
Source
JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS | 1998 / Vol. 335B / Issue 05
Keywords
neural-network; hierarchical structure; system modelling; control systems
DOI
10.1016/S0016-0032(97)00024-0
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In this paper we present a new way of structuring standard classes of neural networks, thereby obtaining a new class of parametric functions named 'Hierarchically Structured Neural Networks' (HSNNs). An HSNN is a special class of network constituted by two sub-networks, a 'master' unit and a 'slave' unit: the master network is fed by a subset of the inputs, and its outputs 'drive' the parameters of the slave network, whose inputs are disjoint from those of the master. After the general definition of an HSNN is given, two simple classes of HSNNs are presented and dedicated back-propagation algorithms are derived for them. HSNNs are a useful tool when some prior knowledge of the nonlinear function to be approximated or designed is available; this is illustrated by means of five examples covering a variety of simple problems. (C) 1998 The Franklin Institute. Published by Elsevier Science Ltd.
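To make the master/slave idea in the abstract concrete, the following Python/NumPy sketch shows one possible interpretation: a master network reads its own subset of inputs and its outputs set the weights and biases of a slave map acting on a disjoint set of inputs. This is an illustrative assumption, not the authors' implementation; the layer sizes, activation, and the choice of a linear slave map are all hypothetical.

```python
# Illustrative master/slave ("hierarchically structured") network sketch.
# Dimensions, activations, and the linear slave map are assumptions made
# for illustration; they are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)


class MasterSlaveNN:
    """Master MLP maps x_master -> the parameters of a linear slave map."""

    def __init__(self, n_master_in, n_slave_in, n_slave_out, n_hidden=8):
        # The master outputs every slave parameter (weight matrix entries
        # plus biases), flattened into a single vector.
        n_slave_params = n_slave_in * n_slave_out + n_slave_out
        self.W1 = 0.1 * rng.standard_normal((n_hidden, n_master_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = 0.1 * rng.standard_normal((n_slave_params, n_hidden))
        self.b2 = np.zeros(n_slave_params)
        self.n_slave_in = n_slave_in
        self.n_slave_out = n_slave_out

    def forward(self, x_master, x_slave):
        # Master pass: its output "drives" the slave parameters.
        h = np.tanh(self.W1 @ x_master + self.b1)
        theta = self.W2 @ h + self.b2
        k = self.n_slave_in * self.n_slave_out
        W_s = theta[:k].reshape(self.n_slave_out, self.n_slave_in)
        b_s = theta[k:]
        # Slave pass on its own, disjoint, inputs.
        return W_s @ x_slave + b_s


# Usage: disjoint input partitions, x_master (2-dim) and x_slave (3-dim).
net = MasterSlaveNN(n_master_in=2, n_slave_in=3, n_slave_out=1)
y = net.forward(rng.standard_normal(2), rng.standard_normal(3))
print(y.shape)  # (1,)
```

Training such a composite map requires propagating the output error through the slave's parameters back into the master's weights, which is what the dedicated back-propagation algorithms mentioned in the abstract address.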
Pages: 929-950
Page count: 22