A generalized-constraint neural network model: Associating partially known relationships for nonlinear regressions

Cited: 35
Authors
Hu, Bao-Gang [1 ,2 ]
Qu, Han-Bing [3 ]
Wang, Yong [2 ]
Yang, Shuang-Hong [4 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, NLPR LIAMA, Beijing 100080, Peoples R China
[2] Chinese Acad Sci, Beijing Grad Sch, Beijing 100049, Peoples R China
[3] Beijing Acad Sci & Technol, Beijing Pattern Recognit Tech Res Ctr, Beijing, Peoples R China
[4] Georgia Inst Technol, Coll Comp, Atlanta, GA 30332 USA
Keywords
Nonlinear approximation; Prior knowledge; Constraints; Parameter identifiability; Black box
DOI
10.1016/j.ins.2009.02.006
Chinese Library Classification
TP [Automation and computer technology]
Discipline classification code
0812
Abstract
In an attempt to enhance the neural network technique so that it can evolve from a "black box" tool into a semi-analytical one, we propose a novel modeling approach of imposing "generalized constraints" on a standard neural network. We redefine approximation problems by use of a new formalization with the aim of embedding prior knowledge explicitly into the model to the maximum extent. A generalized-constraint neural network (GCNN) model has therefore been developed, which basically consists of two submodels. One is constructed by the standard neural network technique to approximate the unknown part of the target function. The other is formed from partially known relationships to impose generalized constraints on the whole model. Three issues arising after combination of the two submodels are discussed: (a) the better approximation provided by the GCNN model compared with a standard neural network, (b) the identifiability of parameters in the partially known relationships, and (c) the discrepancy in the approximation due to removable singularities in the target function. Numerical studies of three benchmark problems show important findings that have not previously been reported in the literature. Significant benefits were observed from using the GCNN model in comparison with a standard neural network. © 2009 Elsevier Inc. All rights reserved.
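The two-submodel idea in the abstract can be illustrated with a minimal numerical sketch. Here the target function, the multiplicative "partially known relationship" exp(-x), and the fixed-random-feature "network" are all illustrative assumptions introduced for this example, not the paper's actual benchmarks or formulation; the GCNN of the paper allows more general couplings between the known and unknown submodels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: y(x) = exp(-x) * u(x), where the decay factor exp(-x)
# plays the role of the partially known relationship and
# u(x) = sin(3x) + 2 is the unknown part to be learned.
def target(x):
    return np.exp(-x) * (np.sin(3 * x) + 2.0)

x = np.linspace(0.0, 2.0, 200)
y = target(x)

# A tiny single-hidden-layer network with fixed random weights; only the
# output weights are fitted, by linear least squares, to keep the sketch
# short and deterministic (a stand-in for full network training).
n_hidden = 50
W = rng.normal(size=n_hidden)
b = rng.normal(size=n_hidden)
H = np.tanh(np.outer(x, W) + b)          # hidden activations, (200, 50)

# Standard ("black box") network: fit y directly.
beta_plain, *_ = np.linalg.lstsq(H, y, rcond=None)
y_plain = H @ beta_plain

# GCNN-style model: the known factor exp(-x) constrains the output, so
# the network only has to approximate the unknown part u(x):
#     model(x) = exp(-x) * NN(x)
known = np.exp(-x)
beta_gc, *_ = np.linalg.lstsq(H * known[:, None], y, rcond=None)
y_gc = known * (H @ beta_gc)

mse_plain = np.mean((y - y_plain) ** 2)
mse_gc = np.mean((y - y_gc) ** 2)
print(f"plain NN MSE: {mse_plain:.2e}, constrained MSE: {mse_gc:.2e}")
```

In this sketch the constraint enters multiplicatively; the network inside the constrained model approximates only the residual unknown relationship, which is the mechanism the abstract describes for embedding prior knowledge explicitly.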
Pages: 1929-1943 (15 pages)