Generalizing expectation propagation with mixtures of exponential family distributions and an application to Bayesian logistic regression
Cited by: 5
Authors:
Sun, Shiliang [1]
He, Shaojie [1]
Affiliations:
[1] East China Normal Univ, Dept Comp Sci & Technol, 3663 North Zhongshan Rd, Shanghai 200241, Peoples R China
Abstract:
Expectation propagation (EP) is a widely used deterministic approximate inference algorithm in Bayesian machine learning. Traditional EP approximates an intractable posterior distribution through a set of local approximations that are updated iteratively. In this paper, we propose a generalized version of EP, called generalized EP (GEP), a new method for approximate inference based on minimizing the KL divergence. However, when the variance of the stochastic gradient is large, the algorithm may take a long time to converge. We therefore use control variates and develop a variance-reduced version of this method, called GEP-CV. We evaluate our approach on Bayesian logistic regression, where it achieves faster convergence and better performance than other state-of-the-art approaches. (C) 2019 Elsevier B.V. All rights reserved.
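The abstract's key ingredient, control variates for reducing the variance of a Monte Carlo gradient estimator, can be illustrated with a minimal sketch. This is the generic control-variate technique, not the paper's GEP-CV algorithm: the target function `f` and control function `g` below are illustrative choices (estimating E[e^x] under a standard normal, with a low-order Taylor approximation of e^x as the control), chosen only because the control's mean is known in closed form.

```python
import numpy as np

def cv_estimate(samples, f, g, g_mean):
    """Estimate E[f(x)] using a control variate g whose mean g_mean is known.

    The adjusted samples f(x) - beta * (g(x) - g_mean) have the same
    expectation as f(x) but lower variance when f and g are correlated.
    """
    fx, gx = f(samples), g(samples)
    # Near-optimal coefficient: beta = Cov(f, g) / Var(g), estimated from samples
    beta = np.cov(fx, gx)[0, 1] / np.var(gx)
    adjusted = fx - beta * (gx - g_mean)
    return adjusted.mean(), adjusted.var()

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)            # x ~ N(0, 1)

f = lambda x: np.exp(x)                # target: E[e^x] = e^{1/2}
g = lambda x: 1.0 + x + 0.5 * x**2     # control: Taylor expansion of e^x
g_mean = 1.5                           # E[g] = 1 + E[x] + 0.5*E[x^2] = 1.5

naive_mean, naive_var = f(x).mean(), f(x).var()
cv_mean, cv_var = cv_estimate(x, f, g, g_mean)

print(f"naive:  mean={naive_mean:.4f}, variance={naive_var:.4f}")
print(f"cv:     mean={cv_mean:.4f}, variance={cv_var:.4f}")
```

Both estimators target the same expectation, but the per-sample variance of the adjusted estimator is much smaller, so fewer samples (or fewer iterations, in a stochastic optimization loop) are needed for a given accuracy. The paper applies this idea to the stochastic gradients of the KL objective.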