Varying Infimum Gradient Descent Algorithm for Agent-Server Systems Using Different Order Iterative Preconditioning Methods

Cited by: 9
Authors
Chen, Jing [1 ]
Wang, Dongqing [2 ]
Liu, Yanjun [1 ]
Zhu, Quanmin [3 ]
Affiliations
[1] Jiangnan Univ, Sch Sci, Wuxi 214122, Jiangsu, Peoples R China
[2] Qingdao Univ, Coll Elect Engn, Qingdao 266071, Peoples R China
[3] Univ West England, Dept Engn Design & Math, Bristol BS16 1QY, Avon, England
Funding
National Natural Science Foundation of China;
Keywords
Agent-server system; convergence rate; gradient descent (GD) algorithm; preconditioning matrix; varying infimum; IDENTIFICATION; MODEL;
DOI
10.1109/TII.2021.3123304
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In the traditional gradient descent (T-GD) algorithm, the convergence rate strongly depends on the condition number of the information matrix: a larger condition number leads to a poor optimal convergence factor infimum mu(op), which sets a ceiling on the convergence rate. That is, once the information matrix is fixed, the convergence factor of the T-GD algorithm can reach at most the infimum mu(op). This article studies a varying infimum gradient descent algorithm, which lowers the infimum by using different order iterative preconditioning methods, as follows: first, for the infinite iterative algorithm, the infimum becomes smaller and smaller as the number of iterations increases; second, for the finite iterative algorithm, the infimum is equal to zero, and the parameter estimates can be obtained in only one iteration; third, an adaptive interval between zero and mu(op) is constructed, which establishes a link between the least squares and T-GD algorithms. Based on the varying infimum gradient descent algorithm, researchers can adaptively choose preconditioning matrices for different kinds of models on a case-by-case basis. The convergence analysis and simulation examples show the effectiveness of the proposed algorithms.
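To make the three preconditioning regimes above concrete, here is a minimal Python sketch, assuming a linear regression model y = Phi @ theta + noise with information matrix H = Phi' * Phi. The three preconditioner choices (identity for plain T-GD, a truncated Neumann-series approximation of the inverse information matrix for the "infinite" iterative case, and the exact inverse for the "finite" one-step case) are illustrative assumptions, not the authors' exact construction, and the helper names are hypothetical.

import numpy as np

# Minimal sketch (assumed setup): linear regression y = Phi @ theta + noise,
# estimated by preconditioned gradient descent.  The preconditioners below
# (identity, truncated Neumann-series approximation of H^{-1}, exact H^{-1})
# are illustrative stand-ins for the paper's iterative preconditioning
# matrices, not the authors' exact construction.

rng = np.random.default_rng(0)
Phi = rng.standard_normal((200, 4))              # regressor (information) data
theta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = Phi @ theta_true + 0.01 * rng.standard_normal(200)

H = Phi.T @ Phi                                  # information matrix

def grad(theta):
    # Gradient of the quadratic cost 0.5 * ||y - Phi @ theta||^2.
    return Phi.T @ (Phi @ theta - y)

def preconditioned_gd(P, mu, iters):
    # theta <- theta - mu * P @ grad(theta), run for a fixed iteration count.
    theta = np.zeros(4)
    for _ in range(iters):
        theta = theta - mu * P @ grad(theta)
    return theta

eigs = np.linalg.eigvalsh(H)

# 1) Plain T-GD (P = I): the best fixed step is limited by the condition
#    number of H, which caps the convergence rate.
mu_plain = 2.0 / (eigs.min() + eigs.max())
theta_gd = preconditioned_gd(np.eye(4), mu_plain, 200)

# 2) "Infinite" iterative preconditioning: a truncated Neumann series
#    approximating H^{-1}; more terms give a better-conditioned P @ H.
alpha = 1.0 / eigs.max()
P_neumann = alpha * sum(np.linalg.matrix_power(np.eye(4) - alpha * H, k)
                        for k in range(10))
theta_pre = preconditioned_gd(P_neumann, 1.0, 50)

# 3) "Finite" case: P = H^{-1} makes the effective condition number 1, so a
#    unit step returns the least-squares estimate in a single iteration.
theta_ls = preconditioned_gd(np.linalg.inv(H), 1.0, 1)

print(np.round(theta_gd, 3))
print(np.round(theta_pre, 3))
print(np.round(theta_ls, 3))

All three runs approach the least-squares estimate; what differs is how well conditioned P @ H is and hence how fast a fixed step can converge, which is the sense in which the choice of preconditioner moves the attainable infimum.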
Pages: 4436-4446
Number of pages: 11