Quantum-Inspired Structure-Preserving Probabilistic Inference

Cited by: 2
Authors
Muecke, Sascha [1]
Piatkowski, Nico [2]
Affiliations
[1] TU Dortmund, AI Grp, Dortmund, Germany
[2] Fraunhofer IAIS, ME Grp, St Augustin, Germany
Source
2022 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC) | 2022
Keywords
probabilistic inference; quantum annealing; FPGA; evolutionary computation; Ising model
DOI
10.1109/CEC55065.2022.9870260
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Probabilistic methods serve as the underlying framework of various machine learning techniques. When using these models, a central problem is computing the partition function, which is intractable for many models of interest. Here, we present the first quantum-inspired method designed specifically for computing fast approximations to the partition function. Our approach uses a novel hardware solver for quadratic unconstrained binary optimization problems that relies on evolutionary computation. The specialized design allows us to assess millions of candidate solutions per second, leading to high-quality maximum a-posteriori (MAP) estimates, even for hard instances. We investigate the expected run-time of our solver and devise new ultra-sparse parity constraints to combine our device with the WISH approximation scheme. A SIMD-like packing strategy further allows us to solve multiple MAP instances at once, resulting in high efficiency and an additional speedup. Numerical experiments show that our quantum-inspired approach produces accurate and robust results. While pure software implementations of the WISH algorithm typically run on large compute clusters with hundreds of CPUs, our results are achieved on two FPGA boards, each consuming less than 10 watts. Moreover, our results extend seamlessly to adiabatic quantum computers.
Pages: 9
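For readers who want to experiment with the idea described in the abstract, the sketch below reproduces the core of the WISH scheme (Ermon et al., 2013) on a toy Ising model in plain Python: random parity (XOR) constraints act as hash functions, a MAP oracle is queried under each constraint set, and the medians of the results are combined into a partition function estimate. The brute-force map_oracle, the weight convention in ising_log_weight, and all parameter choices are illustrative assumptions for a tiny instance; in the paper, the oracle's role is played by the FPGA-based evolutionary QUBO solver combined with ultra-sparse parity constraints.

```python
# Minimal WISH-style sketch: estimate log Z of a small pairwise model by
# combining MAP values obtained under random parity (XOR) constraints.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def ising_log_weight(x, J, h):
    """Unnormalized log-weight of a configuration x in {0,1}^n (illustrative convention)."""
    s = 2.0 * x - 1.0                       # map bits to {-1,+1} spins
    return s @ J @ s + h @ s

def map_oracle(J, h, A, b):
    """Placeholder MAP oracle: exhaustive search over {0,1}^n subject to
    the parity constraints A x = b (mod 2). In the paper this role is
    played by the specialized evolutionary hardware solver."""
    n = len(h)
    best = -np.inf
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        if A.shape[0] and np.any((A @ x) % 2 != b):
            continue                        # violates the hash constraints
        best = max(best, ising_log_weight(x, J, h))
    return best

def wish_log_z(J, h, T=5):
    """WISH estimate of log Z: the median MAP value under i random parity
    constraints approximates the log of the 2^i-th largest weight."""
    n = len(h)
    M = []
    for i in range(n + 1):
        vals = []
        for _ in range(T):
            A = rng.integers(0, 2, size=(i, n))
            b = rng.integers(0, 2, size=i)
            vals.append(map_oracle(J, h, A, b))
        M.append(np.median(vals))
    # log of  Z_hat = exp(M_0) + sum_{i=0}^{n-1} exp(M_{i+1}) * 2^i
    terms = [M[0]] + [M[i + 1] + i * np.log(2.0) for i in range(n)]
    return np.logaddexp.reduce(terms)

if __name__ == "__main__":
    n = 8                                    # tiny so that exact enumeration is feasible
    J = np.triu(rng.normal(scale=0.3, size=(n, n)), 1)
    h = rng.normal(scale=0.1, size=n)
    exact = np.logaddexp.reduce([ising_log_weight(np.array(x), J, h)
                                 for x in itertools.product([0, 1], repeat=n)])
    print("WISH estimate:", wish_log_z(J, h), " exact:", exact)
```

Running the script prints the WISH estimate next to the exact log partition function obtained by enumeration, which is only possible here because n is tiny; swapping the brute-force oracle for a fast QUBO/MAP solver is what makes the scheme scale, as the paper describes.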