CONVERGENCE BOUNDS FOR COMPRESSED GRADIENT METHODS WITH MEMORY BASED ERROR COMPENSATION

Times Cited: 0
Authors
Khirirat, Sarit [1 ]
Magnusson, Sindri [2 ]
Johansson, Mikael [1 ]
Affiliations
[1] Royal Inst Technol KTH, Sch Elect Engn & Comp Sci, Div Decis & Control Syst, Stockholm, Sweden
[2] Harvard Univ, Sch Engn & Appl Sci, 33 Oxford St, Cambridge, MA 02138 USA
Source
2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019
Keywords
Quadratic optimization; quantization; gradient descent; incremental gradient methods;
DOI
10.1109/icassp.2019.8682931
CLC Classification Number
O42 [Acoustics];
Subject Classification Codes
070206; 082403
Abstract
The sheer scale of modern data necessitates information compression in parallel/distributed big-data optimization. Compression schemes that use memory-based error compensation have displayed superior performance in practice; however, to date there has been no theoretical explanation for these observed advantages. This paper provides the first theoretical support for why such compression schemes yield higher-accuracy solutions in optimization. Our results cover both gradient and incremental gradient algorithms for quadratic optimization. Unlike previous works, our theoretical results explicitly quantify the accuracy gains from error compensation, especially for ill-conditioned problems. Finally, numerical results on linear least-squares problems validate the benefit of error compensation and demonstrate the tightness of our convergence guarantees.
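To make the mechanism concrete, below is a minimal Python sketch of an error-compensated compressed gradient method on a linear least-squares problem f(x) = 0.5*||Ax - b||^2: the compression error left over at each iteration is kept in a memory term and added back to the gradient before it is compressed again. The scaled-sign compressor, step size, iteration count, and synthetic data are illustrative assumptions, not the exact scheme or parameters analyzed in the paper.

    import numpy as np

    def compress(v):
        # Scaled sign compressor (illustrative choice): keep signs, rescale
        # by the mean magnitude so the compressed vector has comparable size.
        return np.mean(np.abs(v)) * np.sign(v)

    def ec_gradient_descent(A, b, step, iters):
        # Error-compensated compressed gradient descent for 0.5*||Ax - b||^2.
        x = np.zeros(A.shape[1])
        e = np.zeros_like(x)          # memory of accumulated compression error
        for _ in range(iters):
            g = A.T @ (A @ x - b)     # exact gradient of the quadratic objective
            c = compress(g + e)       # compress the error-corrected gradient
            e = g + e - c             # store the residual for the next iteration
            x = x - step * c          # descend along the compressed direction
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((200, 20))
        b = rng.standard_normal(200)
        x_hat = ec_gradient_descent(A, b, step=1e-3, iters=5000)
        x_star = np.linalg.lstsq(A, b, rcond=None)[0]
        print("distance to least-squares solution:", np.linalg.norm(x_hat - x_star))

Without the memory term e, the same compressor would leave a persistent bias in the update direction; adding the stored residual back before compressing is what lets the iterates approach the least-squares solution.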
Pages: 2857-2861
Page Count: 5