An exponential reduction in training data sizes for machine learning derived entanglement witnesses

Cited: 0
Authors
Rosebush, Aiden R. [1 ]
Greenwood, Alexander C. B. [1 ]
Kirby, Brian T. [2 ,3 ]
Qian, Li [1 ]
Affiliations
[1] Univ Toronto, Dept Elect & Comp Engn, Toronto, ON M5S 3G4, Canada
[2] DEVCOM Army Res Lab, Adelphi, MD 20783 USA
[3] Tulane Univ, New Orleans, LA 70118 USA
Source
MACHINE LEARNING-SCIENCE AND TECHNOLOGY | 2024, Vol. 5, Issue 03
Funding
Canada Foundation for Innovation; Natural Sciences and Engineering Research Council of Canada;
Keywords
quantum entanglement; entanglement witnesses; support vector machines; machine learning; differential programming; GENERATION; STATES;
DOI
10.1088/2632-2153/ad7457
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a support vector machine (SVM)-based approach for generating an entanglement witness that requires exponentially less training data than previously proposed methods. SVMs generate hyperplanes represented by a weighted sum of expectation values of local observables whose coefficients are optimized to sum to a positive number for all separable states and a negative number for as many entangled states as possible near a specific target state. Previous SVM-based approaches for entanglement witness generation used large numbers of randomly generated separable states to perform training, a task with considerable computational overhead. Here, we propose a method for orienting the witness hyperplane using only the significantly smaller set of states consisting of the eigenstates of the generalized Pauli matrices and a set of entangled states near the target entangled states. With the orientation of the witness hyperplane set by the SVM, we tune the plane's placement using a differential program that ensures perfect classification accuracy on a limited test set as well as maximal noise tolerance. For N qubits, the SVM portion of this approach requires only O(6^N) training states, whereas an existing method needs O(24^N). We use this method to construct witnesses for 4- and 5-qubit GHZ states with coefficients agreeing with stabilizer-formalism witnesses to within 3.7 percent and 1 percent, respectively. We also use the same training states to generate novel 4- and 5-qubit W-state witnesses. Finally, we computationally verify these witnesses on small test sets and propose methods for further verification.
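The abstract describes the witness as a hyperplane of the form W = c_0*I + sum_k c_k*O_k over local observables O_k, with Tr(W rho) >= 0 for separable states and < 0 for entangled states near the target. The sketch below (Python; not the authors' code) illustrates the training-set construction for a small 2-qubit case under stated assumptions: the separable training set is the 6^N tensor products of single-qubit Pauli eigenstates, the entangled set is a cloud of slightly perturbed states near an assumed GHZ target, features are expectation values of all local Pauli products, and a linear SVM supplies candidate witness coefficients. The perturbation scale, sample count, and SVC regularization are illustrative choices, and the differentiable offset-tuning stage mentioned in the abstract is not reproduced here.

```python
# Minimal sketch (assumptions noted above), not the authors' implementation.
import itertools
import numpy as np
from sklearn.svm import SVC

# Single-qubit Pauli matrices; their eigenvectors give 6 states per qubit.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [X, Y, Z]

def pauli_eigenstates():
    """Return the 6 normalized eigenvectors of X, Y, Z."""
    states = []
    for P in PAULIS:
        _, vecs = np.linalg.eigh(P)
        states.extend(vecs[:, k] for k in range(2))
    return states

def kron_all(mats):
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def features(state, n):
    """Expectation values of all n-fold tensor products of {I, X, Y, Z}."""
    rho = np.outer(state, state.conj())
    feats = []
    for combo in itertools.product([I2, X, Y, Z], repeat=n):
        feats.append(np.real(np.trace(rho @ kron_all(list(combo)))))
    return feats

N = 2  # small example; the paper targets 4- and 5-qubit GHZ and W states

# Separable training states: all 6^N products of single-qubit Pauli eigenstates.
single = pauli_eigenstates()
separable = [kron_all(list(c)) for c in itertools.product(single, repeat=N)]

# Entangled training states: small random perturbations of an assumed GHZ
# target (generically still entangled for small perturbations).
ghz = np.zeros(2**N, dtype=complex)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)
rng = np.random.default_rng(0)
entangled = []
for _ in range(200):
    psi = ghz + 0.05 * (rng.normal(size=2**N) + 1j * rng.normal(size=2**N))
    entangled.append(psi / np.linalg.norm(psi))

Xdata = np.array([features(s, N) for s in separable + entangled])
ydata = np.array([+1] * len(separable) + [-1] * len(entangled))

# Linear SVM: the learned weights are candidate witness coefficients; the
# paper then fine-tunes the plane's placement with a differentiable program.
clf = SVC(kernel="linear", C=100.0).fit(Xdata, ydata)
print("witness coefficients:", np.round(clf.coef_[0], 3))
print("offset:", clf.intercept_[0])
```

For larger N, the enumerations above grow as 4^N features and 6^N separable product states, which is the scaling the abstract cites for the SVM stage of the method.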
Pages: 24