Quantum self-supervised learning

Cited by: 22
Authors
Jaderberg, B. [1 ]
Anderson, L. W. [1 ]
Xie, W. [2 ]
Albanie, S. [3 ]
Kiffner, M. [1 ,4 ]
Jaksch, D. [1 ,4 ,5 ]
Affiliations
[1] Univ Oxford, Clarendon Lab, Parks Rd, Oxford OX1 3PU, England
[2] Univ Oxford, Dept Engn Sci, Visual Geometry Grp, Oxford, England
[3] Univ Cambridge, Dept Engn, Cambridge, England
[4] Natl Univ Singapore, Ctr Quantum Technol, 3 Sci Dr 2, Singapore 117543, Singapore
[5] Univ Hamburg, Inst Laserphys, D-22761 Hamburg, Germany
Source
QUANTUM SCIENCE AND TECHNOLOGY | 2022, Vol. 7, No. 3
Funding
UK Engineering and Physical Sciences Research Council; National Research Foundation, Singapore;
Keywords
variational quantum algorithms; quantum machine learning; self-supervised learning; deep learning; quantum neural networks; REPRESENTATION; ALGORITHM;
DOI
10.1088/2058-9565/ac6825
Chinese Library Classification
O4 [Physics];
Discipline code
0702;
Abstract
The resurgence of self-supervised learning, whereby a deep learning model generates its own supervisory signal from the data, promises a scalable way to tackle the dramatically increasing size of real-world data sets without human annotation. However, the staggering computational complexity of these methods is such that, for state-of-the-art performance, classical hardware requirements represent a significant bottleneck to further progress. Here we take the first steps towards understanding whether quantum neural networks (QNNs) could meet the demand for more powerful architectures, and test their effectiveness in proof-of-principle hybrid experiments. Interestingly, we observe a numerical advantage for the learning of visual representations using small-scale QNNs over equivalently structured classical networks, even when the quantum circuits are sampled with only 100 shots. Furthermore, we apply our best quantum model to classify unseen images on the ibmq_paris quantum computer and find that current noisy devices can already achieve accuracy equal to that of the equivalent classical model on downstream tasks.
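The abstract's point about sampling quantum circuits with only 100 shots can be illustrated with a toy sketch (not the authors' code, which uses multi-qubit variational circuits): for a single-qubit circuit RY(θ)|0⟩, the expectation value ⟨Z⟩ equals cos θ, and a finite-shot estimate of it is the average of simulated measurement outcomes. The function name and one-qubit setup here are illustrative assumptions.

```python
import numpy as np

def qnn_expectation(theta, shots=100, seed=0):
    """Estimate <Z> for the one-qubit circuit RY(theta)|0> from finite shots.

    Toy illustration of finite-shot sampling; the exact value is cos(theta).
    """
    rng = np.random.default_rng(seed)
    p0 = np.cos(theta / 2.0) ** 2          # probability of measuring |0>
    outcomes = rng.random(shots) < p0      # True -> outcome +1, False -> -1
    return 2.0 * outcomes.mean() - 1.0     # empirical <Z>, in [-1, 1]

exact = np.cos(0.7)                        # analytic <Z> for theta = 0.7
estimate_100 = qnn_expectation(0.7, shots=100)
```

With 100 shots the statistical error of the estimate scales as roughly 1/sqrt(shots), which is the regime the abstract reports an advantage in.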
Pages: 15