Exploiting the Error Resilience of the Preconditioned Conjugate Gradient Method for Energy and Delay Optimization

Times Cited: 0
Authors
Lylina, Natalia [1 ]
Holst, Stefan [2 ]
Jafarzadeh, Hanieh [1 ]
Kourfali, Alexandra [1 ]
Wunderlich, Hans-Joachim [3 ]
Affiliations
[1] Univ Stuttgart, ITI, Pfaffenwaldring 47, D-70569 Stuttgart, Germany
[2] Kyushu Inst Technol, Dept Creat Informat, Kitakyushu, Fukuoka, Japan
[3] Univ Stuttgart, Pfaffenwaldring 47, D-70569 Stuttgart, Germany
Source
2023 IEEE 29TH INTERNATIONAL SYMPOSIUM ON ON-LINE TESTING AND ROBUST SYSTEM DESIGN, IOLTS | 2023
Keywords
Preconditioned Conjugate Gradient; overscaling; energy optimization; hardware accelerators; DESIGN;
DOI
10.1109/IOLTS59296.2023.10224885
CLC Number
TP3 [Computing technology, computer technology];
Subject Classification Code
0812;
Abstract
The Preconditioned Conjugate Gradient (PCG) method is well-established for solving linear systems of equations. Running the PCG method on a hardware accelerator enables fast and efficient computation. At the same time, each hardware accelerator instance may behave slightly differently due to process variability or aging. To handle this variability, a rather pessimistic operating frequency is often selected for the whole population of accelerators. Increasing the frequency may improve performance, but it also increases the risk of computational errors, which can slow the convergence of PCG or even corrupt its results. In this paper, we present a method to determine, for each hardware accelerator instance, the frequency that optimizes the execution time and energy efficiency of the PCG method. First, a technique is presented to analyze the error resilience of the PCG algorithm under overclocking. Based on the analysis results, we increase the frequency to speed up convergence while keeping the error rate below the required threshold.
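For reference, the standard textbook PCG iteration that the abstract builds on can be sketched in Python/NumPy as below. This is a generic illustration only, not the authors' accelerator implementation; the function name `pcg`, the Jacobi preconditioner, the tolerance, and the test matrix are illustrative assumptions.

```python
import numpy as np

def pcg(A, b, M_inv, x0=None, tol=1e-8, max_iter=1000):
    # Solve A x = b for a symmetric positive definite A with the
    # Preconditioned Conjugate Gradient method; M_inv approximates A^{-1}.
    x = np.zeros_like(b) if x0 is None else x0.astype(float)
    r = b - A @ x                      # residual
    z = M_inv @ r                      # preconditioned residual
    p = z.copy()                       # search direction
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)          # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k + 1            # converged after k+1 iterations
        z = M_inv @ r
        rz_new = r @ z
        beta = rz_new / rz             # factor for the new search direction
        p = z + beta * p
        rz = rz_new
    return x, max_iter

# Usage example (hypothetical test problem) with a Jacobi (diagonal) preconditioner.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((50, 50))
    A = Q @ Q.T + 50 * np.eye(50)      # SPD system matrix
    b = rng.standard_normal(50)
    M_inv = np.diag(1.0 / np.diag(A))  # Jacobi preconditioner
    x, iters = pcg(A, b, M_inv)
    print(iters, np.linalg.norm(A @ x - b))
```

Because each iteration repeats the same matrix-vector and vector operations, an error injected by overclocking in one iteration can be partially absorbed by subsequent iterations, which is the resilience property the paper exploits.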
Pages: 7
Related Papers
50 records in total
  • [1] Stochastic optimization using the stochastic preconditioned conjugate gradient method
    Oakley, DR
    Sues, RH
    AIAA JOURNAL, 1996, 34 (09) : 1969 - 1971
  • [2] Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization
    Exl, Lukas
    Fischbacher, Johann
    Kovacs, Alexander
    Oezelt, Harald
    Gusenbauer, Markus
    Schrefl, Thomas
    COMPUTER PHYSICS COMMUNICATIONS, 2019, 235 : 179 - 186
  • [3] Preconditioned conjugate gradient method on the hypercube
    Abe, G.
    Hane, K.
    Conference on Hypercube Concurrent Computers and Applications, 1988,
  • [4] SAOR preconditioned conjugate gradient method
    Wang, Jianguo
    Meng, Guoyan
    IITA 2007: WORKSHOP ON INTELLIGENT INFORMATION TECHNOLOGY APPLICATION, PROCEEDINGS, 2007, : 331 - 334
  • [5] AOR Preconditioned Conjugate Gradient Method
    Wang, Jianguo
    Zhao, Qingshan
    ADVANCES IN MATRIX THEORY AND ITS APPLICATIONS, VOL 1: PROCEEDINGS OF THE EIGHTH INTERNATIONAL CONFERENCE ON MATRIX THEORY AND ITS APPLICATIONS, 2008, : 258 - 261
  • [6] Guardband Optimization for the Preconditioned Conjugate Gradient Algorithm
    Lylina, Natalia
    Holst, Stefan
    Jafarzadeh, Hanieh
    Kourfali, Alexandra
    Wunderlich, Hans-Joachim
    2023 53RD ANNUAL IEEE/IFIP INTERNATIONAL CONFERENCE ON DEPENDABLE SYSTEMS AND NETWORKS WORKSHOPS, DSN-W, 2023, : 195 - 198
  • [7] Efficiency analysis on a truncated Newton method with preconditioned conjugate gradient technique for optimization
    Zhang, JZ
    Deng, NY
    Wang, ZZ
    HIGH PERFORMANCE ALGORITHMS AND SOFTWARE FOR NONLINEAR OPTIMIZATION, 2003, 82 : 383 - 416
  • [8] A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
    Babaie-Kafaki, Saman
    4OR, 2013, 11 : 361 - 374
  • [10] Efficient Parallelization of the Preconditioned Conjugate Gradient Method
    Accary, Gilbert
    Bessonov, Oleg
    Fougere, Dominique
    Gavrilov, Konstantin
    Meradji, Sofiane
    Morvan, Dominique
    PARALLEL COMPUTING TECHNOLOGIES, PROCEEDINGS, 2009, 5698 : 60 - +