An exponential reduction in training data sizes for machine learning derived entanglement witnesses

Cited by: 0
Authors
Rosebush, Aiden R. [1 ]
Greenwood, Alexander C. B. [1 ]
Kirby, Brian T. [2 ,3 ]
Qian, Li [1 ]
Affiliations
[1] Univ Toronto, Dept Elect & Comp Engn, Toronto, ON M5S 3G4, Canada
[2] DEVCOM Army Res Lab, Adelphi, MD 20783 USA
[3] Tulane Univ, New Orleans, LA 70118 USA
Source
MACHINE LEARNING-SCIENCE AND TECHNOLOGY | 2024, Vol. 5, No. 03
Funding
Canada Foundation for Innovation; Natural Sciences and Engineering Research Council of Canada;
Keywords
quantum entanglement; entanglement witnesses; support vector machines; machine learning; differential programming; GENERATION; STATES;
DOI
10.1088/2632-2153/ad7457
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
We propose a support vector machine (SVM) based approach for generating an entanglement witness that requires exponentially less training data than previously proposed methods. SVMs generate hyperplanes represented by a weighted sum of expectation values of local observables whose coefficients are optimized to sum to a positive number for all separable states and a negative number for as many entangled states as possible near a specific target state. Previous SVM-based approaches for entanglement witness generation used large numbers of randomly generated separable states to perform training, a task with considerable computational overhead. Here, we propose a method for orienting the witness hyperplane using only the significantly smaller set of states consisting of the eigenstates of the generalized Pauli matrices and a set of entangled states near the target entangled states. With the orientation of the witness hyperplane set by the SVM, we tune the plane's placement using a differential program that ensures perfect classification accuracy on a limited test set as well as maximal noise tolerance. For N qubits, the SVM portion of this approach requires only O(6^N) training states, whereas an existing method needs O(24^N). We use this method to construct witnesses of 4- and 5-qubit GHZ states with coefficients agreeing with stabilizer formalism witnesses to within 3.7 percent and 1 percent, respectively. We also use the same training states to generate novel 4- and 5-qubit W state witnesses. Finally, we computationally verify these witnesses on small test sets and propose methods for further verification.
Pages: 24
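The training pipeline summarized in the abstract can be illustrated with a short sketch. This is not the authors' code: it assumes a small N = 3 GHZ target, uses scikit-learn's LinearSVC as a generic linear SVM, represents each state by the expectation values of all N-fold tensor products of {I, X, Y, Z}, and generates the entangled training examples by randomly perturbing the GHZ state (the paper's choice of entangled states near the target may differ).

# Minimal sketch, assuming scikit-learn and an N = 3 GHZ target (not the authors' code).
import itertools
import numpy as np
from sklearn.svm import LinearSVC

# Single-qubit Pauli matrices used to build the local-observable feature map.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I2, X, Y, Z]

# The six single-qubit Pauli eigenstates: |0>, |1>, |+>, |->, |+i>, |-i>.
s = 1 / np.sqrt(2)
EIGENSTATES = [
    np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex),
    np.array([s, s]), np.array([s, -s]),
    np.array([s, 1j * s]), np.array([s, -1j * s]),
]

def kron_all(factors):
    out = factors[0]
    for f in factors[1:]:
        out = np.kron(out, f)
    return out

def features(state, n):
    # Expectation values of all n-fold tensor products of {I, X, Y, Z}.
    return np.array([
        np.real(np.conj(state) @ kron_all(combo) @ state)
        for combo in itertools.product(PAULIS, repeat=n)
    ])

N = 3  # number of qubits; kept small so the 6^N separable states stay cheap

# Separable training set: all 6^N products of Pauli eigenstates (label +1).
separable = [kron_all(c) for c in itertools.product(EIGENSTATES, repeat=N)]

# Entangled training set: states near the N-qubit GHZ target (label -1),
# produced here by small random perturbations of the GHZ state (an assumption).
ghz = np.zeros(2 ** N, dtype=complex)
ghz[0] = ghz[-1] = s
rng = np.random.default_rng(0)
entangled = []
for _ in range(200):
    psi = ghz + 0.05 * (rng.normal(size=2 ** N) + 1j * rng.normal(size=2 ** N))
    entangled.append(psi / np.linalg.norm(psi))

X_train = np.array([features(v, N) for v in separable + entangled])
y_train = np.array([+1] * len(separable) + [-1] * len(entangled))

# The learned weight vector orients the witness hyperplane; the abstract's
# differential-programming step would then re-tune the plane's placement.
svm = LinearSVC(C=10.0, max_iter=20000).fit(X_train, y_train)
witness_coefficients = svm.coef_[0]
print(witness_coefficients.round(3))

The coefficient vector printed at the end plays the role of the witness hyperplane's orientation; the offset (bias) would be refined in a separate step, as the abstract describes, to guarantee correct classification on a limited test set and maximal noise tolerance.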