Generalized Constraint Neural Network Regression Model Subject to Linear Priors

Cited by: 35
Authors
Qu, Ya-Jun [1 ]
Hu, Bao-Gang [1 ]
Affiliation
[1] Chinese Acad Sci, Natl Lab Pattern Recognit, Inst Automat, Beijing 100190, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 12
Keywords
Linear constraints; linear priors; nonlinear regression; radial basis function networks; transparency; SUPPORT VECTOR MACHINES; INCORPORATING PRIOR KNOWLEDGE; EXTRACTION;
DOI
10.1109/TNN.2011.2167348
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper reports an extension of our previous investigations on adding transparency to neural networks. We focus on a class of linear priors (LPs), such as symmetry, ranking list, boundary, and monotonicity, which represent either linear-equality or linear-inequality priors. A generalized constraint neural network-LPs (GCNN-LPs) model is studied. Unlike other existing modeling approaches, the GCNN-LP model offers two advantages. First, any LP is embedded in an explicitly structural mode, which may add a higher degree of transparency than a pure algorithm mode. Second, a direct elimination and least squares approach is adopted to study the model, which in experiments achieves better accuracy and lower computational cost than Lagrange multiplier techniques. Specific attention is paid to both "hard (strictly satisfied)" and "soft (weakly satisfied)" constraints for regression problems. Numerical investigations are made on synthetic examples as well as on real-world datasets. Simulation results demonstrate the effectiveness of the proposed modeling approach in comparison with other existing approaches.
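The "direct elimination" route mentioned in the abstract can be illustrated with a minimal sketch: enforce a linear-equality prior A w = b on RBF output weights by eliminating the constrained directions, then solve an ordinary least squares problem in the remaining free variables. All names here (Phi, A, b, the Gaussian widths) are illustrative assumptions, not the paper's notation or its exact algorithm.

```python
import numpy as np

# Minimize ||Phi @ w - y||^2 subject to the hard prior A @ w = b,
# by direct elimination (null-space method) instead of Lagrange multipliers.

rng = np.random.default_rng(0)

# Toy RBF design matrix Phi (n samples x m Gaussian basis functions).
n, m = 50, 6
X = np.linspace(-1.0, 1.0, n)
centers = np.linspace(-1.0, 1.0, m)
Phi = np.exp(-((X[:, None] - centers[None, :]) ** 2) / 0.1)
y = np.sin(2 * X) + 0.05 * rng.standard_normal(n)

# One illustrative linear-equality prior: the output weights sum to zero.
A = np.ones((1, m))
b = np.zeros(1)

# Step 1: a particular solution w_p with A @ w_p = b.
w_p = np.linalg.lstsq(A, b, rcond=None)[0]

# Step 2: a basis Z for the null space of A (rows of Vt beyond rank(A)).
_, _, Vt = np.linalg.svd(A)
Z = Vt[A.shape[0]:].T          # columns of Z satisfy A @ Z = 0

# Step 3: unconstrained least squares in the reduced variable z,
# where every candidate w = w_p + Z @ z satisfies the prior exactly.
z = np.linalg.lstsq(Phi @ Z, y - Phi @ w_p, rcond=None)[0]
w = w_p + Z @ z

print(np.allclose(A @ w, b))   # the hard constraint holds to machine precision
```

Because every feasible weight vector is parameterized as w_p + Z z, the constraint is built into the search space and the reduced problem stays a plain least squares solve, which is the computational appeal of elimination over multiplier methods.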
Pages: 2447-2459
Page count: 13