Normalized Gradient Descent for Variational Quantum Algorithms

Cited by: 6
Authors
Suzuki, Yudai [1 ]
Yano, Hiroshi [2 ]
Raymond, Rudy [3 ,4 ]
Yamamoto, Naoki [2 ,4 ]
Affiliations
[1] Keio Univ, Dept Mech Engn, Kohoku Ku, Hiyoshi 3-14-1, Yokohama, Kanagawa 2238522, Japan
[2] Keio Univ, Dept Appl Phys & Physicoinformat, Kohoku Ku, Hiyoshi 3-14-1, Yokohama, Kanagawa 2238522, Japan
[3] IBM Japan Ltd, IBM Quantum, Chuo Ku, 19-21 Nihonbashi, Tokyo 1038510, Japan
[4] Keio Univ, Quantum Comp Ctr, Kohoku Ku, Hiyoshi 3-14-1, Yokohama, Kanagawa 2238522, Japan
Source
2021 IEEE INTERNATIONAL CONFERENCE ON QUANTUM COMPUTING AND ENGINEERING (QCE 2021) / QUANTUM WEEK 2021 | 2021
Keywords
Variational Quantum Algorithms; Optimization; Normalized Gradient Descent;
DOI
10.1109/QCE52317.2021.00015
CLC number
O4 [Physics];
Subject classification code
0702 ;
Abstract
Variational quantum algorithms (VQAs) are promising methods that leverage noisy quantum computers and classical computing techniques for practical applications. In VQAs, classical optimizers such as gradient-based optimizers are used to adjust the parameters of the quantum circuit so that the objective function is minimized. However, they often suffer from the so-called vanishing gradient, or barren plateau, issue. On the other hand, the normalized gradient descent (NGD) method, which employs the normalized gradient vector to update the parameters, has been successfully applied to several optimization problems. Here, we study the performance of NGD methods in the optimization of VQAs for the first time. Our goal is twofold. The first is to examine the effectiveness of NGD and its variants for overcoming the vanishing gradient problem. The second is to propose a new NGD that attains faster convergence than ordinary NGD. We performed numerical simulations of these gradient-based optimizers in the context of quantum chemistry, where VQAs are used to find the ground state of a given Hamiltonian. The results show the effective convergence properties of the NGD methods in VQAs, compared to the corresponding optimizers without normalization. Moreover, we make use of normalized gradient vectors from past iteration steps to propose a novel historical NGD that has a theoretical guarantee of accelerated convergence, which is also observed in the numerical experiments.
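The core idea of the NGD update described in the abstract is to step along the direction of the gradient while discarding its magnitude, so progress does not stall when gradients become exponentially small. A minimal sketch of this update rule on a classical toy objective (not the paper's quantum-circuit setting, where gradients would come from, e.g., parameter-shift evaluations) could look like:

```python
import numpy as np

def ngd_minimize(grad_fn, theta, lr=0.1, steps=200, eps=1e-12):
    """Normalized gradient descent: theta <- theta - lr * g / ||g||.

    Unlike plain gradient descent, the step size is independent of the
    gradient magnitude, so tiny ("vanishing") gradients still produce
    full-sized parameter updates.
    """
    for _ in range(steps):
        g = grad_fn(theta)
        norm = np.linalg.norm(g)
        if norm < eps:  # (numerically) stationary point reached
            break
        theta = theta - lr * g / norm
    return theta

# Toy quadratic objective f(theta) = ||theta||^2, whose gradient is 2*theta.
theta0 = np.array([3.0, -4.0])
theta_star = ngd_minimize(lambda t: 2.0 * t, theta0)
```

Because the step length is fixed at `lr`, the iterates approach the minimizer at a constant rate and finally oscillate within a ball of radius about `lr` around it, which is why NGD schemes in practice decay the learning rate or, as in the historical variant proposed here, combine information from past gradients.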
Pages: 1-9 (9 pages)