Bridging the Gap between Transformer-Based Neural Networks and Tensor Networks for Quantum Chemistry

Times Cited: 0
Authors
Kan, Bowen [1 ,2 ]
Tian, Yingqi [1 ]
Wu, Yangjun [3 ]
Zhang, Yunquan [1 ]
Shang, Honghui [3 ]
Affiliations
[1] Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100190, Peoples R China
[3] Univ Sci & Technol China, Key Lab Precis & Intelligent Chem, Hefei 230026, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
WAVE-FUNCTIONS; STATE;
DOI
10.1021/acs.jctc.4c01703
Chinese Library Classification (CLC) Number
O64 [Physical Chemistry (Theoretical Chemistry), Chemical Physics];
Discipline Classification Code
070304; 081704;
Abstract
The neural network quantum state (NNQS) method has demonstrated promising results in ab initio quantum chemistry, achieving remarkable accuracy in molecular systems. However, efficient calculation of systems with large active spaces remains challenging. This study introduces a novel approach that bridges tensor network states with the transformer-based NNQS-Transformer (QiankunNet) to enhance accuracy and convergence for systems with relatively large active spaces. By transforming tensor network states into active-space configuration interaction (CI) type wave functions, QiankunNet achieves accuracy surpassing both the density matrix renormalization group (DMRG) results used for pretraining and traditional coupled cluster methods, particularly in strongly correlated regimes. We investigate two configuration transformation methods: the sweep-based direct conversion (Conv.) method and the entanglement-driven genetic algorithm (EDGA) method, with Conv. showing superior efficiency. The effectiveness of this approach is validated on H2O with a large active space (10e, 24o) in the cc-pVDZ basis set, demonstrating an efficient pipeline between DMRG and QiankunNet and offering a promising direction for advancing quantum state representation in complex molecular systems.
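To illustrate the basic operation such a tensor-network-to-CI conversion rests on, the following is a minimal sketch, not the paper's Conv. or EDGA implementation: contracting a matrix product state (e.g., from a converged DMRG run) against an occupation-number configuration yields that configuration's CI-type amplitude, and the dominant configurations collected this way could serve as pretraining data for a transformer ansatz. The function name mps_amplitude and the random toy MPS are hypothetical, for illustration only.

# Minimal sketch (assumption: MPS stored as a list of rank-3 tensors with
# shape (left bond, physical, right bond) and trivial boundary bonds).
import numpy as np

def mps_amplitude(mps, config):
    """Contract an MPS against an occupation-number configuration
    (sequence of local indices) to obtain its CI-type amplitude."""
    env = np.ones(1)                      # trivial left boundary (dim 1)
    for tensor, n in zip(mps, config):
        env = env @ tensor[:, n, :]       # fix the physical index, absorb site
    return env[0]                         # trivial right boundary (dim 1)

# Toy example: random 4-site MPS, bond dimension 2, occupations 0/1.
# In practice the tensors would come from a converged DMRG wave function.
rng = np.random.default_rng(0)
bond = [1, 2, 2, 2, 1]
mps = [rng.standard_normal((bond[i], 2, bond[i + 1])) for i in range(4)]
print(mps_amplitude(mps, (1, 0, 1, 0)))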
Pages: 3426-3439
Page count: 14
Related Papers
50 records in total
  • [1] Exploiting Transformer-Based Networks and Boosting Algorithms for Ultralow Compressible Boride Design
    Siriwardane, Edirisuriya M. Dilanga
    Dong, Rongzhi
    Hu, Jianjun
    Cakir, Deniz
    JOURNAL OF PHYSICAL CHEMISTRY C, 2025, 129 (17) : 8326 - 8338
  • [2] Building quantum neural networks based on a swap test
    Zhao, Jian
    Zhang, Yuan-Hang
    Shao, Chang-Peng
    Wu, Yu-Chun
    Guo, Guang-Can
    Guo, Guo-Ping
    PHYSICAL REVIEW A, 2019, 100 (01)
  • [3] Transformer neural networks and quantum simulators: a hybrid approach for simulating strongly correlated systems
    Lange, Hannah
    Bornet, Guillaume
    Emperauger, Gabriel
    Chen, Cheng
    Lahaye, Thierry
    Kienle, Stefan
    Browaeys, Antoine
    Bohrdt, Annabelle
    QUANTUM, 2025, 9
  • [4] Accurate Calculation of Interatomic Forces with Neural Networks Based on a Generative Transformer Architecture
    Lai, Juntao
    Kan, Bowen
    Wu, Yangjun
    Fu, Qiang
    Shang, Honghui
    Li, Zhenyu
    Yang, Jinlong
    JOURNAL OF CHEMICAL THEORY AND COMPUTATION, 2024, 20 (21) : 9478 - 9487
  • [5] Unifying neural-network quantum states and correlator product states via tensor networks
    Clark, Stephen R.
    JOURNAL OF PHYSICS A-MATHEMATICAL AND THEORETICAL, 2018, 51 (13)
  • [6] Quantum codes from neural networks
    Bausch, Johannes
    Leditzky, Felix
    NEW JOURNAL OF PHYSICS, 2020, 22 (02)
  • [7] Bridging the gap between quantum Monte Carlo and F12-methods
    Chinnamsetty, Sambasiva Rao
    Luo, Hongjun
    Hackbusch, Wolfgang
    Flad, Heinz-Juergen
    Uschmajew, Andre
    CHEMICAL PHYSICS, 2012, 401 : 36 - 44
  • [8] Accurate computation of quantum excited states with neural networks
    Pfau, David
    Axelrod, Simon
    Sutterud, Halvard
    von Glehn, Ingrid
    Spencer, James S.
    SCIENCE, 2024, 385 (6711) : eadn0137
  • [9] Artificial Neural Networks-Based Machine Learning for Wireless Networks: A Tutorial
    Chen, Mingzhe
    Challita, Ursula
    Saad, Walid
    Yin, Changchuan
    Debbah, Merouane
    IEEE COMMUNICATIONS SURVEYS AND TUTORIALS, 2019, 21 (04): 3039 - 3071
  • [10] Remaining Useful Life Prediction of Lithium-Ion Batteries by Using a Denoising Transformer-Based Neural Network
    Han, Yunlong
    Li, Conghui
    Zheng, Linfeng
    Lei, Gang
    Li, Li
    ENERGIES, 2023, 16 (17)