Amortized Monte Carlo Integration

Cited by: 0
Authors
Golinski, Adam [1 ,2 ]
Wood, Frank [3 ]
Rainforth, Tom [1 ]
Affiliations
[1] Univ Oxford, Dept Stat, Oxford, England
[2] Univ Oxford, Dept Engn Sci, Oxford, England
[3] Univ British Columbia, Dept Comp Sci, Vancouver, BC, Canada
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97 | 2019 / Vol. 97
Funding
European Research Council; UK Engineering and Physical Sciences Research Council; Natural Sciences and Engineering Research Council of Canada
Keywords
NORMALIZING CONSTANTS; TUMOR-GROWTH; RATIOS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Current approaches to amortizing Bayesian inference focus solely on approximating the posterior distribution. Typically, this approximation is, in turn, used to calculate expectations for one or more target functions, a computational pipeline that is inefficient when the target function(s) are known upfront. In this paper, we address this inefficiency by introducing AMCI, a method for amortizing Monte Carlo integration directly. AMCI operates similarly to amortized inference but produces three distinct amortized proposals, each tailored to a different component of the overall expectation calculation. At runtime, samples are drawn separately from each amortized proposal before being combined into an overall estimate of the expectation. We show that while existing approaches are fundamentally limited in the level of accuracy they can achieve, AMCI can theoretically produce arbitrarily small errors for any integrable target function using only a single sample from each proposal at runtime. We further show that it empirically outperforms the theoretically optimal self-normalized importance sampler on a number of example problems. Furthermore, AMCI allows amortization not only over datasets but also over target functions.
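The abstract's idea of tailoring a separate proposal to each component of the expectation can be illustrated with plain importance sampling on a toy conjugate-Gaussian model. The sketch below is an assumption-laden stand-in, not the paper's method: the target E[f(x) | y] is split into a positive part, a negative part, and the normalizing constant p(y), each estimated with its own proposal. In AMCI these three proposals would be amortized (learned as functions of y); here they are fixed Gaussians chosen by hand, and the model, function f, and proposal parameters are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (an assumption for illustration):
#   prior      x ~ N(0, 1)
#   likelihood y | x ~ N(x, 1)
# so the posterior given y is N(y/2, 1/2).
y_obs = 1.0

def log_joint(x, y):
    """log p(x, y) = log N(x; 0, 1) + log N(y; x, 1)."""
    return -0.5 * x**2 - 0.5 * (y - x)**2 - np.log(2.0 * np.pi)

# Target function with positive and negative parts: f(x) = x.
f = lambda x: x

def is_estimate(g, mu_q, sigma_q, n):
    """Importance-sampling estimate of ∫ g(x) p(x, y_obs) dx
    using a Gaussian proposal q = N(mu_q, sigma_q**2)."""
    x = rng.normal(mu_q, sigma_q, n)
    log_q = (-0.5 * ((x - mu_q) / sigma_q) ** 2
             - np.log(sigma_q * np.sqrt(2.0 * np.pi)))
    w = np.exp(log_joint(x, y_obs) - log_q)
    return np.mean(g(x) * w)

n = 100_000
# One proposal per component, loosely matched to where each
# integrand has mass (hand-picked here; amortized in AMCI).
E1 = is_estimate(lambda x: np.maximum(f(x), 0.0), 1.0, 1.0, n)    # ∫ f+ p dx
E2 = is_estimate(lambda x: np.maximum(-f(x), 0.0), -1.0, 1.0, n)  # ∫ f- p dx
E3 = is_estimate(lambda x: np.ones_like(x), 0.5, 1.0, n)          # p(y_obs)

estimate = (E1 - E2) / E3
# Ground truth: E[x | y=1] = y/2 = 0.5 for this conjugate model.
print(estimate)
```

Because each of the three integrands gets its own proposal, each ratio term can in principle be estimated with very low variance, which is the intuition behind the abstract's claim that errors can be driven arbitrarily small with a single sample per proposal.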
Pages: 10