Design of Recurrent Neural Networks for Solving Constrained Least Absolute Deviation Problems

Cited by: 27
Authors
Hu, Xiaolin [1 ,2 ]
Sun, Changyin [3 ]
Zhang, Bo [1 ,2 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, TNList, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[3] Southeast Univ, Sch Automat, Nanjing 210096, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2010, Vol. 21, No. 7
Funding
National Natural Science Foundation of China;
Keywords
L-1-norm optimization; least absolute deviation (LAD); minimax optimization; recurrent neural network (RNN); stability analysis; QUADRATIC-PROGRAMMING PROBLEMS; VARIATIONAL-INEQUALITIES;
DOI
10.1109/TNN.2010.2048123
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recurrent neural networks for solving constrained least absolute deviation (LAD) problems, or L1-norm optimization problems, have attracted much interest in recent years, but so far most such networks can handle only certain special linear constraints efficiently. In this paper, two neural networks are proposed for solving LAD problems with various linear constraints, including equality, two-sided inequality, and bound constraints. When tailored to special cases of LAD problems in which not all constraint types are present, the two networks yield simpler architectures than most existing ones in the literature. In particular, for solving problems with both equality and one-sided inequality constraints, another network is invented. All of the networks proposed in this paper are rigorously shown to be capable of solving the corresponding problems. The different networks designed for the same type of problem possess the same structural complexity, because they share the same computing blocks and differ only in the connections between some of those blocks; this provides flexibility for circuit realization. Numerical simulations are carried out to illustrate the theoretical results and compare the convergence rates of the networks.
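As a rough illustration of the kind of dynamics such networks implement (this is a generic projected-subgradient sketch, not the specific architectures proposed in the paper), the bound-constrained LAD problem min ||Ax - b||_1 subject to l <= x <= u can be solved by iterating a projection recurrence; the function name and step-size choice below are illustrative assumptions:

```python
import numpy as np

def lad_projection_network(A, b, l, u, step=0.01, iters=20000):
    """Hedged sketch: forward-Euler discretization of a projection-type
    recurrent network for min ||Ax - b||_1 s.t. l <= x <= u.
    Not the paper's exact model; a generic projected-subgradient flow."""
    m, n = A.shape
    x = np.clip(np.zeros(n), l, u)  # start from a feasible point
    for _ in range(iters):
        # subgradient of the L1 residual ||Ax - b||_1
        g = A.T @ np.sign(A @ x - b)
        # move against the subgradient, then project back onto the box [l, u]
        x = np.clip(x - step * g, l, u)
    return x
```

With a small fixed step the iterate settles into a neighborhood of an LAD solution; the continuous-time networks in the paper are analyzed with rigorous stability arguments rather than this simple discretization.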
Pages: 1073-1086
Page count: 14