Bayesian model search for mixture models based on optimizing variational bounds

Cited by: 90
Authors
Ueda, N
Ghahramani, Z
Affiliations
[1] NTT Corp, Commun Sci Lab, Kyoto 6190237, Japan
[2] UCL, Gatsby Computat Neurosci Unit, London WC1N 3AR, England
Keywords
variational Bayes; Bayesian model search; mixture models; mixture of experts models; EM algorithm
DOI
10.1016/S0893-6080(02)00040-0
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
When learning a mixture model, one suffers from two problems: local optima and model structure determination. In this paper, we present a method for simultaneously solving both problems based on the variational Bayesian (VB) framework. First, within the VB framework, we derive an objective function that can simultaneously optimize both the model parameter distributions and the model structure. Next, focusing on mixture models, we present a deterministic algorithm that approximately optimizes this objective function using the split and merge operations we previously proposed within the maximum likelihood framework. We then apply the method to mixture of experts (MoE) models and show experimentally that the proposed method can find the optimal number of experts of a MoE while avoiding local maxima. (C) 2002 Elsevier Science Ltd. All rights reserved.
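In standard VB notation (not reproduced from this record), the kind of objective the abstract refers to is the variational free energy, a lower bound on the log marginal likelihood of a model structure m with hidden variables Z and parameters theta:

F_m[q] = \sum_{Z} \int q(Z, \theta) \ln \frac{p(X, Z, \theta \mid m)}{q(Z, \theta)} \, d\theta \;\le\; \ln p(X \mid m),

maximized over factorized posteriors q(Z, \theta) = q(Z) q(\theta). As a minimal sketch of model search by comparing such bounds (a plain grid search over structures, not the paper's split-and-merge algorithm), the Python snippet below fits Bayesian Gaussian mixtures of increasing size and keeps the structure with the highest fitted bound. scikit-learn's BayesianGaussianMixture and its lower_bound_ attribute are this example's assumptions, and whether that bound is directly comparable across component counts depends on the implementation.

# Minimal sketch: choose the number of mixture components by
# maximizing a fitted variational lower bound (assumes scikit-learn;
# illustrative only, not the paper's split-and-merge procedure).
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Toy data: three well-separated 2-D Gaussian clusters.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 2))
               for c in ([0.0, 0.0], [5.0, 5.0], [0.0, 5.0])])

best_bound, best_k = -np.inf, None
for k in range(1, 7):  # candidate model structures
    vb = BayesianGaussianMixture(
        n_components=k,
        weight_concentration_prior_type="dirichlet_distribution",
        n_init=3, max_iter=500, random_state=0,
    ).fit(X)
    # lower_bound_ is the fitted variational bound on the evidence
    # (per sample); a higher value favors this model structure.
    if vb.lower_bound_ > best_bound:
        best_bound, best_k = vb.lower_bound_, k

print("selected K =", best_k, "with bound", best_bound)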
Pages: 1223-1241
Number of pages: 19