Using multilayer perceptrons as receptive fields in the design of neural networks

Cited by: 15
Authors
Cimino, Mario G. C. A. [1 ]
Pedrycz, Witold [2 ,3 ]
Lazzerini, Beatrice [1 ]
Marcelloni, Francesco [1 ]
Affiliations
[1] Univ Pisa, Dipartimento Ingn Informaz Elettron Informat Tele, I-56122 Pisa, Italy
[2] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB T6G 2G7, Canada
[3] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Conditional clustering; Local modeling; Neural receptive fields; Radial basis function (RBF) networks; Referential neural networks; Regression; Algorithm; Mixtures
DOI
10.1016/j.neucom.2008.10.014
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a new neural network architecture based on a family of referential multilayer perceptrons (RMLPs) that play the role of generalized receptive fields. In contrast to "standard" radial basis function (RBF) neural networks, the proposed network topology offers a considerable level of flexibility, as the resulting receptive fields are highly diversified and capable of adjusting themselves to the characteristics of the locally available experimental data. We discuss in detail a design strategy for the novel architecture that fully exploits the modeling capabilities of the contributing RMLPs. The strategy comprises three phases. In the first phase, we form a "blueprint" of the network by employing a specialized version of the commonly encountered fuzzy C-means (FCM) clustering algorithm, namely the conditional (context-based) FCM. In this phase, our intent is to generate a collection of information granules (fuzzy sets) in the space of input and output variables, narrowed down to certain contexts. In the second phase, based on a global view of the structure, we refine the input-output relationships by engaging a collection of RMLPs, where each RMLP is trained using the subset of data associated with the corresponding context fuzzy set. During training, each receptive field focuses on the characteristics of the locally available data and builds a nonlinear mapping in a referential mode. Finally, the connections of the receptive fields are optimized through global minimization at the linear aggregation unit located at the output layer of the overall architecture. We also include a series of numeric experiments involving synthetic and real-world data sets that provide a thorough comparative analysis with standard RBF neural networks. (c) 2008 Elsevier B.V. All rights reserved.
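The three-phase strategy summarized in the abstract (fuzzy contexts found by conditional FCM, one receptive field trained per context, and a linearly optimized output aggregation) can be illustrated with a short sketch. The code below is only a rough approximation of that pipeline, not the authors' implementation: it replaces conditional FCM with simple triangular fuzzy contexts defined on the output variable, uses plain scikit-learn MLPRegressor models as stand-ins for the referential MLPs, and fits the output-layer aggregation by ordinary least squares. All function names, context shapes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of the three-phase design described in the abstract.
# Assumptions: triangular contexts instead of conditional FCM, standard
# MLPs instead of referential MLPs, least squares for the output layer.
import numpy as np
from sklearn.neural_network import MLPRegressor

def triangular_contexts(y, n_contexts):
    """Phase 1 (simplified): fuzzy contexts on the output variable.
    Returns an (n_samples, n_contexts) matrix of context memberships."""
    centers = np.linspace(y.min(), y.max(), n_contexts)
    width = centers[1] - centers[0]
    return np.clip(1.0 - np.abs(y[:, None] - centers[None, :]) / width, 0.0, 1.0)

def fit_receptive_fields(X, y, memberships, threshold=0.5, seed=0):
    """Phase 2: one small MLP per context, trained only on the data that
    belong to that context fuzzy set (membership above a cut-off)."""
    fields = []
    for k in range(memberships.shape[1]):
        idx = memberships[:, k] >= threshold
        mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=seed)
        mlp.fit(X[idx], y[idx])
        fields.append(mlp)
    return fields

def fit_output_layer(fields, X, y):
    """Phase 3: linear aggregation of the receptive-field outputs,
    obtained here by ordinary least squares (with a bias term)."""
    Z = np.column_stack([f.predict(X) for f in fields] + [np.ones(len(X))])
    w, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return w

def predict(fields, w, X):
    Z = np.column_stack([f.predict(X) for f in fields] + [np.ones(len(X))])
    return Z @ w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(400, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=400)   # toy regression data
    M = triangular_contexts(y, n_contexts=3)            # phase 1
    fields = fit_receptive_fields(X, y, M)               # phase 2
    w = fit_output_layer(fields, X, y)                    # phase 3
    print("training RMSE:", np.sqrt(np.mean((predict(fields, w, X) - y) ** 2)))
```

In the paper the contexts come from conditional (context-based) FCM and the receptive fields are referential MLPs; substituting those components into the same skeleton preserves the overall three-phase flow of blueprint, local training, and global output-layer optimization.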
Pages: 2536-2548
Number of pages: 13