Let quantum neural networks choose their own frequencies

Cited by: 3
Authors
Jaderberg, Ben [1 ]
Gentile, Antonio A. [1 ]
Berrada, Youssef Achari [2 ]
Shishenina, Elvira [2 ]
Elfving, Vincent E. [1 ]
Affiliations
[1] PASQAL, 7 Rue Leonard de Vinci, F-91300 Massy, France
[2] BMW Grp, D-80788 Munich, Germany
Keywords
Compendex;
DOI
10.1103/PhysRevA.109.042421
Chinese Library Classification
O43 [Optics];
Discipline Codes
070207 ; 0803 ;
Abstract
Parameterized quantum circuits as machine learning models are typically well described by their representation as a partial Fourier series of the input features, with frequencies uniquely determined by the feature map's generator Hamiltonians. Ordinarily, these data-encoding generators are chosen in advance, fixing the space of functions that can be represented. In this work we consider a generalization of quantum models to include a set of trainable parameters in the generator, leading to a trainable-frequency (TF) quantum model. We numerically demonstrate how TF models can learn generators with desirable properties for solving the task at hand, including nonregularly spaced frequencies in their spectra and flexible spectral richness. Finally, we showcase the real-world effectiveness of our approach, demonstrating an improved accuracy in solving the Navier-Stokes equations using a TF model with only a single parameter added to each encoding operation. Since TF models encompass conventional fixed-frequency models, they may offer a sensible default choice for variational quantum machine learning.
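To illustrate the idea described in the abstract, the short Python sketch below (an illustration of the concept, not code from the paper) builds a single-qubit model whose data encoding exp(-i x * alpha * Z / 2) carries a trainable scale alpha on the generator. With alpha fixed, the model output is an ordinary fixed-frequency Fourier series in x; letting alpha train alongside the variational angles makes the frequency itself adjustable, which is the trainable-frequency (TF) construction in miniature. The names tf_model, ry, theta1, theta2, and alpha are my own, purely for this example.

import numpy as np

# Pauli-Z observable measured at the end of the toy circuit.
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ry(theta):
    # Single-qubit Y rotation used as a toy variational block.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def tf_model(x, theta1, theta2, alpha):
    # Expectation <Z> of W(theta2) S(x) W(theta1) |0>, where the encoding
    # S(x) = exp(-i x * alpha * Z / 2) has a trainable frequency alpha.
    encoding = np.diag(np.exp(-1j * x * alpha * np.array([0.5, -0.5])))
    state = ry(theta2) @ encoding @ ry(theta1) @ np.array([1, 0], dtype=complex)
    return float((state.conj() @ Z @ state).real)

# With alpha = 1 this reduces to a conventional fixed-frequency model of the form
# a + b*cos(x + phi); training alpha shifts that frequency to better fit the data.
xs = np.linspace(0, 2 * np.pi, 5)
print([round(tf_model(x, theta1=0.7, theta2=0.3, alpha=1.5), 3) for x in xs])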
Pages: 10