Quantum variational algorithms are swamped with traps

Cited by: 91
Authors
Anschuetz, Eric R. [1 ]
Kiani, Bobak T. [2 ]
Affiliations
[1] MIT, Ctr Theoret Phys, 77 Massachusetts Ave, Cambridge, MA 02139 USA
[2] MIT, Dept Elect Engn & Comp Sci, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Funding
US National Science Foundation;
DOI
10.1038/s41467-022-35364-5
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07; 0710; 09;
Abstract
One of the most important properties of classical neural networks is how surprisingly trainable they are, even though their training algorithms typically rely on optimizing complicated, nonconvex loss functions. Previous results have shown that, unlike classical neural networks, variational quantum models are often not trainable. The most studied phenomenon is the onset of barren plateaus in the training landscape of these quantum models, typically when the models are very deep. This focus on barren plateaus has made the phenomenon almost synonymous with the trainability of quantum models. Here, we show that barren plateaus are only a part of the story. We prove that a wide class of variational quantum models, which are shallow and exhibit no barren plateaus, have only a superpolynomially small fraction of local minima within any constant energy of the global minimum, rendering these models untrainable if no good initial guess of the optimal parameters is known. We also study the trainability of variational quantum algorithms within a statistical query framework, and show that noisy optimization of a wide variety of quantum models is impossible with a sub-exponential number of queries. Finally, we numerically confirm our results on a variety of problem instances. Though we exclude a wide variety of quantum algorithms here, we give reason for optimism for certain classes of variational algorithms and discuss potential ways forward in showing the practical utility of such algorithms.
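The kind of variational loss landscape the abstract refers to can be made concrete with a toy example (not taken from the paper). The sketch below, which assumes an illustrative two-qubit transverse-field-Ising Hamiltonian and a minimal Ry-plus-CNOT ansatz of my own choosing, builds the variational energy function and scans it over a parameter grid; the point is only that even a tiny, shallow circuit yields a nonconvex landscape of the sort being optimized.

```python
import numpy as np

# Pauli matrices and 2x2 identity
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    """Kronecker product of a sequence of operators."""
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

# Illustrative problem Hamiltonian: 2-qubit transverse-field Ising model
H = -kron(Z, Z) - 0.5 * (kron(X, I2) + kron(I2, X))

# CNOT entangling gate (control = qubit 0)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def energy(params):
    """Variational energy <psi(t1,t2)|H|psi(t1,t2)> for a shallow ansatz:
    one Ry rotation per qubit followed by a single CNOT."""
    t1, t2 = params
    psi0 = np.zeros(4, dtype=complex)
    psi0[0] = 1.0  # start from |00>
    psi = CNOT @ kron(ry(t1), ry(t2)) @ psi0
    return float(np.real(psi.conj() @ H @ psi))

# Exhaustively scan the (nonconvex) 2-parameter landscape
grid = np.linspace(-np.pi, np.pi, 101)
E = np.array([[energy((a, b)) for b in grid] for a in grid])
print("best grid energy:   ", E.min())
print("exact ground energy:", np.linalg.eigvalsh(H).min())
```

For this particular toy instance the ansatz happens to be expressive enough to reach the ground state, so the grid minimum nearly matches the exact ground energy; the paper's point is that in larger shallow models, almost all local minima sit far above it.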
Pages: 10