A General Approach to Dropout in Quantum Neural Networks

Cited: 5
Authors
Scala, Francesco [1 ]
Ceschini, Andrea [2 ]
Panella, Massimo [2 ]
Gerace, Dario [1 ]
Affiliations
[1] Univ Pavia, Dipartimento Fis, Pavia, Italy
[2] Univ Roma La Sapienza, Dipartimento Ingn Informaz Elettron & Telecomunica, Rome, Italy
Keywords
dropout; overfitting; overparametrization; quantum neural networks;
DOI
10.1002/qute.202300220
Chinese Library Classification
O4 [Physics];
Discipline classification code
0702 ;
Abstract
In classical machine learning (ML), "overfitting" is the phenomenon that occurs when a given model learns the training data excessively well and consequently performs poorly on unseen data. A commonly employed countermeasure in ML is the so-called "dropout," which prevents computational units from becoming too specialized and thus reduces the risk of overfitting. With the advent of quantum neural networks (QNNs) as learning models, overfitting might soon become an issue, owing to the increasing depth of quantum circuits as well as the multiple embeddings of classical features that are employed to endow the computation with nonlinearity. Here, a generalized approach to applying the dropout technique in QNN models is presented, defining and analyzing different quantum dropout strategies to avoid overfitting and achieve a high level of generalization. This study allows one to envision the power of quantum dropout in enabling generalization, providing useful guidelines for determining the maximal dropout probability of a given model, based on overparametrization theory. It also highlights that quantum dropout does not impact the features of the QNN models, such as expressibility and entanglement. All these conclusions are supported by extensive numerical simulations and may pave the way to efficiently employing deep quantum machine learning (QML) models based on state-of-the-art QNNs.

Randomly dropping artificial neurons and all their connections during the training phase reduces overfitting in classical neural networks, thus improving performance on previously unseen data. The authors introduce different dropout strategies applied to quantum neural networks, learning models based on parametrized quantum circuits. Quantum dropout strategies might help reduce overfitting without impacting the expressibility and entanglement of these models.
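The core idea behind the quantum dropout strategies summarized above, randomly removing parametrized gates from the circuit during training, can be illustrated with a toy single-qubit sketch in plain NumPy. This is not the paper's actual implementation; the function names and the gate-dropout rule (replacing a dropped rotation gate with the identity) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_state(thetas, drop_prob=0.0):
    """Apply a chain of RY gates to |0>, independently replacing each
    parametrized gate with the identity (gate dropout) with probability
    drop_prob -- a toy analogue of randomly dropping gates in a
    parametrized quantum circuit during training."""
    state = np.array([1.0, 0.0])
    for theta in thetas:
        if rng.random() < drop_prob:
            continue  # gate "dropped": acts as identity on this pass
        state = ry(theta) @ state
    return state

thetas = np.array([0.3, 0.7, 1.1])
full = circuit_state(thetas, drop_prob=0.0)      # deterministic full circuit
dropped = circuit_state(thetas, drop_prob=0.5)   # stochastically thinned circuit
```

Because same-axis rotations compose additively, the full circuit here equals a single RY of the summed angles, while the thinned circuit remains a valid (unitary-evolved, hence normalized) state; in training, a fresh dropout mask would be sampled at every optimization step.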
Pages: 18
Related papers
56 items in total
  • [21] García-Martín, D., 2024, arXiv:2302.05059
  • [22] Gil-Fuster, E., 2024, arXiv:2306.13461
  • [23] Gühne, Otfried; Tóth, Géza. Entanglement detection. Physics Reports, 2009, 474(1-6): 1-75
  • [24] Haug, Tobias; Bharti, Kishor; Kim, M. S. Capacity and Quantum Geometry of Parametrized Quantum Circuits. PRX Quantum, 2021, 2(4)
  • [25] Hawkins, D. M. The problem of overfitting. Journal of Chemical Information and Computer Sciences, 2004, 44(1): 1-12
  • [26] Holmes, Zoë; Sharma, Kunal; Cerezo, M.; Coles, Patrick J. Connecting Ansatz Expressibility to Gradient Magnitudes and Barren Plateaus. PRX Quantum, 2022, 3(1)
  • [27] Huang, Hsin-Yuan; Broughton, Michael; Cotler, Jordan; Chen, Sitan; Li, Jerry; Mohseni, Masoud; Neven, Hartmut; Babbush, Ryan; Kueng, Richard; Preskill, John; McClean, Jarrod R. Quantum advantage in learning from experiments. Science, 2022, 376(6598): 1182+
  • [28] Hubregtsen, Thomas; Pichlmeier, Josef; Stecher, Patrick; Bertels, Koen. Evaluation of parameterized quantum circuits: on the relation between classification accuracy, expressibility, and entangling capability. Quantum Machine Intelligence, 2021, 3(1)
  • [29] Kingma, D. P., 2014, Advances in Neural Information Processing Systems, Vol. 27
  • [30] Kobayashi, Masahiro; Nakaji, Kouhei; Yamamoto, Naoki. Overfitting in quantum machine learning and entangling dropout. Quantum Machine Intelligence, 2022, 4(2)