Iterative Pre-Conditioning to Expedite the Gradient-Descent Method

Cited by: 0
Authors
Chakrabarti, Kushal [1 ]
Gupta, Nirupam [2 ]
Chopra, Nikhil [1 ]
Affiliations
[1] Univ Maryland, College Pk, MD 20742 USA
[2] Georgetown Univ, Washington, DC 20057 USA
Keywords
LINEAR-SYSTEMS
DOI
10.23919/acc45564.2020.9147603
CLC classification
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
This paper considers the problem of multi-agent distributed optimization. In this problem, there are multiple agents in the system, and each agent knows only its local cost function. The objective of the agents is to collectively compute a common minimum of the aggregate of all their local cost functions. In principle, this problem is solvable using a distributed variant of the traditional gradient-descent method, which is an iterative method. However, the convergence speed of the traditional gradient-descent method is highly influenced by the conditioning of the optimization problem being solved. Specifically, the method requires a large number of iterations to converge to a solution if the optimization problem is ill-conditioned. In this paper, we propose an iterative pre-conditioning approach that can significantly attenuate the influence of the problem's conditioning on the convergence speed of the gradient-descent method. The proposed pre-conditioning approach can be easily implemented in distributed systems and has minimal computation and communication overhead. For now, we consider only a specific distributed optimization problem wherein the individual local cost functions of the agents are quadratic. Besides the theoretical guarantees, the improved convergence speed of our approach is demonstrated through experiments on a real dataset.
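The idea described in the abstract can be sketched for a centralized quadratic cost. The data, step sizes, the regularizer, and the Richardson-style update driving the pre-conditioner toward a regularized inverse Hessian below are all illustrative assumptions for a single-machine least-squares instance, not the paper's exact distributed algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-conditioned least-squares cost f(x) = 0.5 * ||A x - b||^2.
A = rng.standard_normal((50, 5))
A[:, 0] *= 10.0                      # stretch one direction to worsen conditioning
b = rng.standard_normal(50)
H = A.T @ A                          # Hessian of f (positive definite here)
n = H.shape[0]

beta = 0.1                           # regularizer for the pre-conditioner target (assumed)
delta = 1.0 / (np.linalg.norm(H, 2) + beta)   # step size for the pre-conditioner update
alpha = 1.0                          # step size for the estimate update (assumed)

x = np.zeros(n)                      # estimate of the minimizer
K = np.zeros((n, n))                 # iterative pre-conditioner, driven toward (H + beta*I)^{-1}
I = np.eye(n)

for _ in range(2000):
    # Pre-conditioner update: one Richardson step toward solving (H + beta*I) K = I,
    # interleaved with the descent iterations instead of computed up front.
    K = K - delta * ((H + beta * I) @ K - I)
    # Pre-conditioned gradient-descent step on the estimate.
    grad = A.T @ (A @ x - b)
    x = x - alpha * K @ grad

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x - x_star))    # residual shrinks far faster than plain gradient descent
```

As `K` approaches the regularized inverse Hessian, the update approaches a Newton-like step, which is why the iteration count stops scaling with the condition number of `H`.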
Pages: 3977 - 3982
Page count: 6
Related Papers
50 records in total
  • [1] Iterative pre-conditioning for expediting the distributed gradient-descent method: The case of linear least-squares problem
    Chakrabarti, Kushal
    Gupta, Nirupam
    Chopra, Nikhil
    AUTOMATICA, 2022, 137
  • [2] A Gradient-Descent Method for Curve Fitting on Riemannian Manifolds
    Samir, Chafik
    Absil, P. -A.
    Srivastava, Anuj
    Klassen, Eric
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2012, 12 (01) : 49 - 73
  • [4] A gradient-descent iterative learning control algorithm for a non-linear system
    He, Zhiying
    Pu, Hongji
    TRANSACTIONS OF THE INSTITUTE OF MEASUREMENT AND CONTROL, 2025, 47 (02) : 342 - 351
  • [5] Robustness of Iteratively Pre-Conditioned Gradient-Descent Method: The Case of Distributed Linear Regression Problem
    Chakrabarti, Kushal
    Gupta, Nirupam
    Chopra, Nikhil
    IEEE CONTROL SYSTEMS LETTERS, 2021, 5 (06): 2180 - 2185
  • [6] Learning for hierarchical fuzzy systems based on the gradient-descent method
    Wang, Di
    Zeng, Xiao-Jun
    Keane, John A.
    2006 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS, VOLS 1-5, 2006: 92 - +
  • [7] Robustness of Iteratively Pre-Conditioned Gradient-Descent Method: The Case of Distributed Linear Regression Problem
    Chakrabarti, Kushal
    Gupta, Nirupam
    Chopra, Nikhil
    2021 AMERICAN CONTROL CONFERENCE (ACC), 2021, : 2248 - 2253
  • [8] Practical Gradient-Descent for Memristive Crossbars
    Nair, Manu V.
    Dudek, Piotr
    2015 INTERNATIONAL CONFERENCE ON MEMRISTIVE SYSTEMS (MEMRISYS), 2015
  • [9] ON PRE-CONDITIONING OF MATRICES
    OSBORNE, EE
    JOURNAL OF THE ACM, 1960, 7 (04) : 338 - 345
  • [10] Sensory pre-conditioning
    Brogden, WJ
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 1939, 25 (04): 323 - 332