Accelerating Bayesian Inference on Structured Graphs Using Parallel Gibbs Sampling

Cited by: 4
Authors
Ko, Glenn G. [1 ]
Chai, Yuji [1 ]
Rutenbar, Rob A. [2 ]
Brooks, David [1 ]
Wei, Gu-Yeon [1 ]
Affiliations
[1] Harvard Univ, Cambridge, MA 02138 USA
[2] Univ Pittsburgh, Pittsburgh, PA USA
Source
2019 29TH INTERNATIONAL CONFERENCE ON FIELD-PROGRAMMABLE LOGIC AND APPLICATIONS (FPL) | 2019
Keywords
Bayesian inference; Markov chain Monte Carlo; Gibbs sampling; hardware accelerator; Markov random field
DOI
10.1109/FPL.2019.00033
CLC number
TP3 [Computing Technology, Computer Technology]
Subject classification code
0812
Abstract
Bayesian models and inference form a class of machine learning methods that is useful when data is scarce and prior knowledge about the application allows better conclusions to be drawn. However, Bayesian models often require computing high-dimensional integrals, and finding the posterior distribution can be intractable. One of the most commonly used approximate methods for Bayesian inference is Gibbs sampling, a Markov chain Monte Carlo (MCMC) technique for estimating a target stationary distribution. The idea in Gibbs sampling is to generate posterior samples by iterating through the variables, sampling each from its conditional distribution with all the other variables held fixed. While Gibbs sampling is a popular method for probabilistic graphical models such as Markov Random Fields (MRFs), the plain algorithm is slow because it visits the variables sequentially. In this work, we describe a binary-label MRF Gibbs sampling inference architecture and extend it to a 64-label version capable of running multiple perceptual applications, such as sound source separation and stereo matching. The described accelerator employs chromatic scheduling of variables to parallelize all the conditionally independent variables across 257 samplers, implemented on the FPGA portion of a CPU-FPGA SoC. For a real-time streaming sound source separation task, we show the hybrid CPU-FPGA implementation is 230x faster than a commercial mobile processor while maintaining the recommended latency of under 50 ms. The 64-label version achieves 137x and 679x speedups for binary-label and 64-label MRF Gibbs sampling inference, respectively.
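The chromatic scheduling idea in the abstract can be illustrated with a small sketch: on a 2-colorable graph such as a grid MRF (checkerboard coloring), all variables of one color are conditionally independent given the other color, so an entire color class can be sampled in one parallel step instead of one variable at a time. The sketch below is illustrative only, not the paper's accelerator; the Ising-style potentials and the `unary`/`coupling` parameters are assumptions.

```python
import numpy as np

def chromatic_gibbs_binary_mrf(unary, coupling, n_iters=100, rng=None):
    """Chromatic Gibbs sampling for a binary-label MRF on a 2-D grid.

    A grid graph is 2-colorable (checkerboard), so all same-color sites
    are conditionally independent and are updated together, vectorized,
    in one sweep -- the scheduling idea behind parallel Gibbs samplers.

    unary:    (H, W) array; unary log-potential favoring label 1.
    coupling: scalar pairwise strength rewarding neighbor agreement.
    """
    rng = np.random.default_rng() if rng is None else rng
    H, W = unary.shape
    x = rng.integers(0, 2, size=(H, W))       # random init in {0, 1}
    yy, xx = np.mgrid[0:H, 0:W]
    colors = (yy + xx) % 2                    # checkerboard 2-coloring
    for _ in range(n_iters):
        for c in (0, 1):                      # one parallel step per color
            # Sum of the four neighbors' labels (zero-padded borders,
            # i.e. out-of-grid neighbors count as label 0).
            p = np.pad(x, 1)
            nbr1 = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            # Conditional log-odds of label 1 at every site:
            # unary + coupling * (#neighbors == 1  -  #neighbors == 0).
            logits = unary + coupling * (2 * nbr1 - 4)
            prob1 = 1.0 / (1.0 + np.exp(-logits))
            mask = colors == c
            x[mask] = (rng.random((H, W)) < prob1)[mask]
    return x
```

With two colors, a full sweep needs only two synchronization points regardless of grid size, which is what makes the schedule attractive for wide hardware parallelism.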
Pages: 159-165
Page count: 7
Related Papers
50 records total
  • [31] Improving Statistical Machine Translation Using Bayesian Word Alignment and Gibbs Sampling
    Mermer, Coskun
    Saraclar, Murat
    Sarikaya, Ruhi
    IEEE TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2013, 21(5): 1090-1101
  • [32] Stochastic approximation Monte Carlo Gibbs sampling for structural change inference in a Bayesian heteroscedastic time series model
    Kim, Jaehee
    Cheon, Sooyoung
    JOURNAL OF APPLIED STATISTICS, 2014, 41(10): 2157-2177
  • [33] Bayesian Inference of (Co)Variance Components and Genetic Parameters for Economic Traits in Iranian Holsteins via Gibbs Sampling
    Faraji-Arough, H.
    Aslaminejad, A. A.
    Tahmoorespur, M.
    Rokouei, M.
    Shariati, M. M.
    IRANIAN JOURNAL OF APPLIED ANIMAL SCIENCE, 2015, 5(1): 51-60
  • [34] Parallel algorithms for Bayesian inference in spatial Gaussian models
    Whiley, M
    Wilson, SP
    COMPSTAT 2002: PROCEEDINGS IN COMPUTATIONAL STATISTICS, 2002: 485-490
  • [35] Efficient Sparse Bayesian Learning via Gibbs Sampling
    Tan, Xing
    Li, Jian
    Stoica, Peter
    2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010: 3634-3637
  • [36] Approximate blocked Gibbs sampling for Bayesian neural networks
    Papamarkou, Theodore
    STATISTICS AND COMPUTING, 2023, 33(5)
  • [38] Bayesian Exploratory Factor Analysis via Gibbs Sampling
    Quintero, Adrian
    Lesaffre, Emmanuel
    Verbeke, Geert
    JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS, 2024, 49(1): 121-142
  • [39] A new economical design of acceptance sampling models using Bayesian inference
    Fallahnezhad, Mohammad Saber
    Aslam, Muhammad
    ACCREDITATION AND QUALITY ASSURANCE, 2013, 18: 187-195
  • [40] An efficient sampling technique for Bayesian inference with computationally demanding models
    Reichert, P
    Schervish, M
    Small, MJ
    TECHNOMETRICS, 2002, 44(4): 318-327