Adiabatic Quantum Computation Applied to Deep Learning Networks

Cited by: 13
Authors
Liu, Jeremy [1 ,2 ]
Spedalieri, Federico M. [2 ,3 ]
Yao, Ke-Thia [2 ]
Potok, Thomas E. [4 ]
Schuman, Catherine [4 ]
Young, Steven [4 ]
Patton, Robert [4 ]
Rose, Garrett S. [5 ]
Chamka, Gangotree [5 ]
Affiliations
[1] Univ Southern Calif, Dept Comp Sci, Los Angeles, CA 90089 USA
[2] Univ Southern Calif, Inst Informat Sci, Marina Del Rey, CA 90292 USA
[3] Univ Southern Calif, Dept Elect Engn, Los Angeles, CA 90089 USA
[4] Oak Ridge Natl Lab, Computat Data Analyt Grp, Oak Ridge, TN 37830 USA
[5] Univ Tennessee, Dept Elect Engn & Comp Sci, Knoxville, TN 37996 USA
Keywords
deep learning; quantum computing; neuromorphic computing; high performance computing
DOI
10.3390/e20050380
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
Training deep learning networks is a difficult task due to computational complexity, and this is traditionally handled by simplifying network topology to enable parallel computation on graphical processing units (GPUs). However, the emergence of quantum devices allows reconsideration of complex topologies. We illustrate a particular network topology that can be trained to classify MNIST data (an image dataset of handwritten digits) and neutrino detection data using a restricted form of adiabatic quantum computation known as quantum annealing performed by a D-Wave processor. We provide a brief description of the hardware and how it solves Ising models, how we translate our data into the corresponding Ising models, and how we use available expanded topology options to explore potential performance improvements. Although we focus on the application of quantum annealing in this article, the work discussed here is just one of three approaches we explored as part of a larger project that considers alternative means for training deep learning networks. The other approaches involve using a high performance computing (HPC) environment to automatically find network topologies with good performance and using neuromorphic computing to find a low-power solution for training deep learning networks. Our results show that our quantum approach can find good network parameters in a reasonable time despite increased network topology complexity; that HPC can find good parameters for traditional, simplified network topologies; and that neuromorphic computers can use low power memristive hardware to represent complex topologies and parameters derived from other architecture choices.
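The abstract describes translating network training data into Ising models that the D-Wave annealer minimizes. As a rough illustration of the objective involved (not the authors' actual mapping), the following sketch evaluates an Ising energy over spins s_i in {-1, +1} with local fields h_i and couplings J_ij, and brute-forces the ground state — the configuration a quantum annealer seeks via adiabatic evolution. All names and the tiny example problem are illustrative assumptions.

```python
import itertools

def ising_energy(spins, h, J):
    """Energy of one spin configuration under the Ising model:
    E(s) = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j, s_i in {-1, +1}.
    h is a list of local fields; J maps index pairs (i, j) to couplings."""
    energy = sum(h[i] * s for i, s in enumerate(spins))
    energy += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return energy

def ground_state(n, h, J):
    """Exhaustively search all 2**n configurations for the minimum energy —
    a classical stand-in for what the annealer approximates in hardware."""
    return min(itertools.product([-1, 1], repeat=n),
               key=lambda s: ising_energy(s, h, J))

# Two spins with a ferromagnetic coupling (J < 0): the ground state
# aligns both spins, giving energy -1.
s = ground_state(2, [0, 0], {(0, 1): -1})
```

Brute force is only viable for a handful of spins; the paper's point is that annealing hardware targets this same minimization for problem sizes where exhaustive search is infeasible.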
Pages: 28