Learning Feynman Diagrams with Tensor Trains

Cited: 29
Authors
Fernandez, Yuriel Nunez [1 ]
Jeannin, Matthieu [1 ]
Dumitrescu, Philipp T. [2 ]
Kloss, Thomas [1 ,3 ]
Kaye, Jason [2 ,4 ]
Parcollet, Olivier [2 ,5 ]
Waintal, Xavier [1 ]
Institutions
[1] Univ Grenoble Alpes, CEA, Grenoble INP, IRIG, Pheliqs, F-38000 Grenoble, France
[2] Flatiron Inst, Ctr Computat Quantum Phys, 162 5th Ave, New York, NY 10010 USA
[3] Univ Grenoble Alpes, Inst Neel, CNRS, F-38000 Grenoble, France
[4] Flatiron Inst, Ctr Computat Math, 162 5th Ave, New York, NY 10010 USA
[5] Univ Paris Saclay, Inst Phys Theor, CNRS, CEA, F-91191 Gif Sur Yvette, France
Keywords
QUANTUM; APPROXIMATION; MATRIX; QUASIOPTIMALITY;
DOI
10.1103/PhysRevX.12.041018
Chinese Library Classification
O4 [Physics]
Discipline Classification Code
0702
Abstract
We use tensor network techniques to obtain high-order perturbative diagrammatic expansions for the quantum many-body problem at very high precision. The approach is based on a tensor train parsimonious representation of the sum of all Feynman diagrams, obtained in a controlled and accurate way with the tensor cross interpolation algorithm. It yields the full time evolution of physical quantities in the presence of an arbitrary time-dependent interaction. Our benchmarks on the Anderson quantum impurity problem, within the real-time nonequilibrium Schwinger-Keldysh formalism, demonstrate that this technique supersedes diagrammatic quantum Monte Carlo by orders of magnitude in precision and speed, with convergence rates 1/N² or faster, where N is the number of function evaluations. The method also works in parameter regimes characterized by strongly oscillatory integrals in high dimension, which suffer from a catastrophic sign problem in quantum Monte Carlo calculations. Finally, we present two exploratory studies showing that the technique generalizes to more complex situations: a double quantum dot and a single impurity embedded in a two-dimensional lattice.
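The key mechanism behind the abstract's claim can be illustrated with a minimal sketch (not the authors' code): once a function of d variables is stored in tensor-train (TT) form, the sum over all grid points — the analogue of the high-dimensional integrals over Feynman diagram vertices — costs O(d·r²·n) instead of O(nᵈ). The sketch below builds the TT by successive SVDs (TT-SVD), a simple stand-in for the paper's tensor cross interpolation, which constructs the same representation from a small number of sampled entries.

```python
import numpy as np

def tt_svd(a, tol=1e-12):
    """Factor a d-dimensional array into tensor-train cores via successive
    truncated SVDs. Each core has shape (r_left, n, r_right)."""
    shape, d = a.shape, a.ndim
    cores, r = [], 1
    rest = a.reshape(shape[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(rest, full_matrices=False)
        rank = max(1, int(np.sum(s > tol * s[0])))  # truncate small modes
        cores.append(u[:, :rank].reshape(r, shape[k], rank))
        rest = s[:rank, None] * vt[:rank]
        r = rank
        if k < d - 2:
            rest = rest.reshape(r * shape[k + 1], -1)
    cores.append(rest.reshape(r, shape[d - 1], 1))
    return cores

def tt_sum(cores):
    """Sum over all tensor entries: contract each core's physical index
    with a vector of ones. Cost grows linearly in d, versus exponentially
    for the dense sum."""
    acc = np.ones((1, 1))
    for c in cores:
        acc = acc @ np.tensordot(c, np.ones(c.shape[1]), axes=([1], [0]))
    return acc[0, 0]

# A rank-2 function of d = 4 variables on an n-point grid per variable.
n = 6
x = np.linspace(0.0, 1.0, n)
a = (np.einsum('i,j,k,l->ijkl', *[np.sin(x)] * 4)
     + np.einsum('i,j,k,l->ijkl', *[np.cos(x)] * 4))

cores = tt_svd(a)
print([c.shape for c in cores])        # bond dimensions saturate at 2
print(abs(tt_sum(cores) - a.sum()))    # matches the dense sum
```

For a smooth, effectively low-rank integrand the bond dimension r stays small, which is why the paper's convergence with the number N of function evaluations can be so fast; TCI achieves the same compression without ever forming the dense array.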
Pages: 30