Quantum Optical Experiments Modeled by Long Short-Term Memory

Cited by: 7
Authors
Adler, Thomas [1 ]
Erhard, Manuel [2 ,3 ,8 ]
Krenn, Mario [4 ,5 ,6 ,9 ]
Brandstetter, Johannes [1 ,10 ]
Kofler, Johannes [1 ]
Hochreiter, Sepp [1 ,7 ]
Affiliations
[1] Johannes Kepler Univ Linz, Inst Machine Learning, ELLIS Unit Linz, LIT AI Lab, A-4040 Linz, Austria
[2] Univ Vienna, Austrian Acad Sci, Inst Quantum Opt & Quantum Informat, A-1090 Vienna, Austria
[3] Univ Vienna, Vienna Ctr Quantum Sci & Technol, A-1090 Vienna, Austria
[4] Univ Toronto, Dept Chem, Toronto, ON M5G 1M1, Canada
[5] Vector Inst Artificial Intelligence, Toronto, ON M5G 1M1, Canada
[6] Univ Toronto, Dept Comp Sci, Toronto, ON M5G 1M1, Canada
[7] Inst Adv Res Artificial Intelligence IARAI, Landstrasser Hauptstr 5, A-1030 Vienna, Austria
[8] Quantum Technol Labs GmbH, Wohllebengasse 4-4, A-1040 Vienna, Austria
[9] Max Planck Inst Sci Light, D-91058 Erlangen, Germany
[10] Univ Amsterdam, Fac Sci, Informat Inst, NL-1090 GH Amsterdam, Netherlands
Funding
Austrian Science Fund;
Keywords
quantum optics; multipartite high-dimensional entanglement; supervised machine learning; long short-term memory; ENTANGLEMENT;
DOI
10.3390/photonics8120535
Chinese Library Classification
O43 [Optics];
Discipline codes
070207; 0803;
Abstract
We demonstrate how machine learning is able to model experiments in quantum physics. Quantum entanglement is a cornerstone for upcoming quantum technologies, such as quantum computation and quantum cryptography. Of particular interest are complex quantum states with more than two particles and a large number of entangled quantum levels. Given such a multiparticle high-dimensional quantum state, it is usually impossible to reconstruct an experimental setup that produces it. To search for interesting experiments, one thus has to randomly create millions of setups on a computer and calculate the respective output states. In this work, we show that machine learning models can provide significant improvement over random search. We demonstrate that a long short-term memory (LSTM) neural network can successfully learn to model quantum experiments by correctly predicting output state characteristics for given setups without the necessity of computing the states themselves. This approach not only allows for faster search, but is also an essential step towards the automated design of multiparticle high-dimensional quantum experiments using generative machine learning models.
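The abstract describes the core idea: encode an experimental setup as a sequence of optical-component tokens, feed it through an LSTM, and predict a property of the output quantum state without simulating the state itself. A minimal sketch of that pipeline is below, assuming a hypothetical component vocabulary, layer sizes, and a binary "highly entangled or not" target; none of these reflect the authors' actual architecture or training data, and the weights here are random (untrained).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical component vocabulary: down-conversion source, beam
# splitter, half-wave plate, Dove prism. Illustrative only.
vocab = {"SPDC": 0, "BS": 1, "HWP": 2, "DP": 3}
d_in, d_h = 8, 16  # embedding and hidden sizes (assumed)

E = rng.normal(scale=0.1, size=(len(vocab), d_in))     # token embeddings
W = rng.normal(scale=0.1, size=(4 * d_h, d_in + d_h))  # gates i, f, g, o stacked
b = np.zeros(4 * d_h)
w_out = rng.normal(scale=0.1, size=d_h)                # classifier head

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_predict(setup):
    """Run a component sequence through one LSTM cell and return a
    predicted probability for the (assumed) binary state property."""
    h = np.zeros(d_h)
    c = np.zeros(d_h)
    for comp in setup:
        x = E[vocab[comp]]
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # cell-state update
        h = sigmoid(o) * np.tanh(c)                    # hidden state
    return sigmoid(w_out @ h)                          # probability in (0, 1)

p = lstm_predict(["SPDC", "BS", "HWP", "DP"])
print(round(float(p), 3))
```

Trained on labeled (setup, output-state property) pairs, such a model can score candidate setups far faster than computing each output state, which is what makes the guided search in the paper outperform random search.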
Pages: 9