Hamiltonian Monte Carlo acceleration using surrogate functions with random bases

Times cited: 19
Authors
Zhang, Cheng [1 ]
Shahbaba, Babak [2 ]
Zhao, Hongkai [1 ]
Affiliations
[1] Univ Calif Irvine, Dept Math, Irvine, CA 92697 USA
[2] Univ Calif Irvine, Dept Stat, Irvine, CA 92697 USA
Funding
US National Science Foundation
Keywords
Markov chain Monte Carlo; Hamiltonian dynamics; Surrogate method; Random bases; Pseudoinverse; Approximation; Machine
DOI
10.1007/s11222-016-9699-1
Chinese Library Classification
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
For big data analysis, the high computational cost of Bayesian methods often limits their application in practice. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo method, namely, Hamiltonian Monte Carlo. The key idea is to explore and exploit the structure and regularity in the parameter space of the underlying probabilistic model to construct an effective approximation of its geometric properties. To this end, we build a surrogate function to approximate the target distribution using properly chosen random bases and an efficient optimization process. The resulting method provides a flexible, scalable, and efficient sampling algorithm, which converges to the correct target distribution. We show that by choosing the basis functions and optimization process differently, our method can be related to other approaches for the construction of surrogate functions, such as generalized additive models or Gaussian process models. Experiments based on simulated and real data show that our approach leads to substantially more efficient sampling algorithms compared to existing state-of-the-art methods.
Pages: 1473-1490
Page count: 18
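
To make the idea in the abstract concrete, below is a minimal, self-contained sketch, not the authors' implementation: the function names (make_random_bases, fit_surrogate, hmc_step), the choice of random Fourier (cosine) bases, and the ridge-regularized least-squares fit are illustrative assumptions. The sketch approximates the potential energy U(q) = -log pi(q) by a linear combination of random bases fitted to previously evaluated points, drives the leapfrog integrator with the cheap surrogate gradient, and keeps the exact potential in the Metropolis accept/reject step so that the chain still targets the correct distribution.

```python
import numpy as np


def make_random_bases(dim, n_bases, scale=1.0, seed=0):
    """Draw random Fourier bases phi_j(q) = cos(w_j . q + b_j)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=scale, size=(n_bases, dim))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_bases)
    return W, b


def features(q, W, b):
    """Evaluate all basis functions at a single point q; shape (n_bases,)."""
    return np.cos(W @ q + b)


def grad_features(q, W, b):
    """Gradients of the basis functions at q; shape (n_bases, dim)."""
    return -np.sin(W @ q + b)[:, None] * W


def fit_surrogate(points, U_values, W, b, ridge=1e-6):
    """Ridge-regularized least-squares fit of U(q) ~ features(q) @ beta,
    using exact potential-energy evaluations collected at visited points."""
    Phi = np.array([features(q, W, b) for q in points])
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ np.asarray(U_values))


def hmc_step(q, U, grad_U_surr, eps=0.1, n_leap=20, seed=None):
    """One HMC transition: leapfrog driven by the surrogate gradient,
    exact potential U only in the Metropolis correction."""
    rng = np.random.default_rng(seed)
    p0 = rng.standard_normal(q.shape)
    q_new, p = q.copy(), p0 - 0.5 * eps * grad_U_surr(q)
    for _ in range(n_leap):
        q_new = q_new + eps * p
        p = p - eps * grad_U_surr(q_new)
    p = p + 0.5 * eps * grad_U_surr(q_new)  # roll the last full step back to a half step
    log_alpha = (U(q) + 0.5 * p0 @ p0) - (U(q_new) + 0.5 * p @ p)
    return (q_new, True) if np.log(rng.uniform()) < log_alpha else (q, False)


# Once beta has been fit, the surrogate gradient used above is simply
#   grad_U_surr = lambda q: beta @ grad_features(q, W, b)
```

Keeping the exact potential in the acceptance test is what preserves the correct stationary distribution, while the repeated gradient evaluations inside the leapfrog loop, the expensive part for large datasets, are handed off to the surrogate.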