In this paper we present a new way of structuring standard classes of neural networks, yielding a new class of parametric functions which we name 'Hierarchically Structured Neural Networks' (HSNNs). An HSNN is composed of two sub-networks, a 'master' unit and a 'slave' unit: the master network is fed by a subset of the inputs, and its outputs are used to 'drive' the parameters of the slave network, whose inputs are disjoint from those of the master. After giving the general definition of an HSNN, we present two simple classes of HSNNs and derive dedicated Back-Propagation algorithms for them. HSNNs are a useful tool when some prior knowledge of the nonlinear function to be approximated or designed is available; this is illustrated by five examples discussing a variety of simple problems. © 1998 The Franklin Institute. Published by Elsevier Science Ltd.
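As a rough illustration only (not the formulation given in the paper), the sketch below shows one way the master/slave coupling could be realised in NumPy: a master network maps its own inputs to the full parameter vector of a small one-hidden-layer slave network, which is then evaluated on the disjoint slave inputs. All layer sizes, the linear master, and the tanh slave activation are hypothetical choices made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: master inputs, slave inputs, slave hidden units.
n_m, n_s, h = 2, 3, 4

# The slave net has weights W1 (h x n_s), biases b1 (h) and output weights
# w2 (h); the master net (here a single linear layer) must emit all of them.
n_slave_params = h * n_s + h + h
Wm = rng.normal(scale=0.1, size=(n_slave_params, n_m))
bm = np.zeros(n_slave_params)

def hsnn(x_m, x_s):
    """Evaluate the HSNN: the master's output vector parametrises the slave."""
    theta = Wm @ x_m + bm                  # master output drives slave parameters
    W1 = theta[:h * n_s].reshape(h, n_s)   # slave hidden-layer weights
    b1 = theta[h * n_s:h * n_s + h]        # slave hidden-layer biases
    w2 = theta[h * n_s + h:]               # slave output weights
    hidden = np.tanh(W1 @ x_s + b1)        # slave hidden layer
    return w2 @ hidden                     # scalar slave output

print(hsnn(rng.normal(size=n_m), rng.normal(size=n_s)))
```

In such a scheme the gradient of a training loss reaches the master's weights only through the slave's parameters, which is why a dedicated Back-Propagation algorithm, such as those the paper derives for its two HSNN classes, is needed in place of the standard one.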