Asymptotic Consistency of α-Renyi-Approximate Posteriors

Cited by: 0
Authors
Jaiswal, Prateek [1 ]
Rao, Vinayak [2 ]
Honnappa, Harsha [1 ]
Affiliations
[1] Purdue Univ, Sch Ind Engn, W Lafayette, IN 47907 USA
[2] Purdue Univ, Dept Stat, W Lafayette, IN 47907 USA
Funding
US National Science Foundation;
Keywords
alpha-Renyi divergence; Asymptotic consistency; Bayesian computation; Variational inference;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Discipline Code
0812;
Abstract
We study the asymptotic consistency properties of alpha-Renyi approximate posteriors, a class of variational Bayesian methods that approximate an intractable Bayesian posterior with a member of a tractable family of distributions, the member being chosen to minimize the alpha-Renyi divergence from the true posterior. Unique to our work is that we consider settings with alpha > 1, resulting in approximations that upper-bound the log-likelihood and consequently have a wider spread than traditional variational approaches, which minimize the Kullback-Leibler (KL) divergence from the posterior. Our primary result identifies sufficient conditions under which consistency holds, centering on the existence of a 'good' sequence of distributions in the approximating family that possesses, among other properties, the right rate of convergence to a limit distribution. We further characterize good sequences by showing that a sequence of distributions that converges too quickly cannot be a good sequence. We also extend our analysis to the setting where alpha equals one, corresponding to the minimizer of the reverse KL divergence, and to models with local latent variables, and we illustrate the existence of good sequences with a number of examples. Our results complement a growing body of work on the frequentist properties of variational Bayesian methods.
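For orientation, a brief sketch of the objective the abstract describes, in standard notation rather than the paper's own: D_alpha denotes the alpha-Renyi divergence between densities, Q a tractable approximating family, and pi(. | X_1:n) the posterior given n observations. The ordering of the divergence arguments shown below (posterior in the first slot) is an assumption consistent with the mass-covering, wider-spread behaviour the abstract attributes to alpha > 1; the paper's precise definition may differ.
\[
  D_{\alpha}\!\left(p \,\|\, q\right)
    \;=\; \frac{1}{\alpha - 1}\,
      \log \int p(\theta)^{\alpha}\, q(\theta)^{1-\alpha}\, d\theta,
    \qquad \alpha > 0,\ \alpha \neq 1,
\]
\[
  q_n^{*} \;\in\; \operatorname*{arg\,min}_{q \in \mathcal{Q}}\;
    D_{\alpha}\!\bigl(\pi(\cdot \mid X_{1:n}) \,\|\, q\bigr),
    \qquad \alpha > 1.
\]
As alpha tends to 1, D_alpha converges to the KL divergence between its two arguments, which is how the alpha = 1 case discussed in the abstract connects to a KL-based variational objective.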
Pages: 42