Bayesian model search for mixture models based on optimizing variational bounds

Cited by: 90
Authors
Ueda, N
Ghahramani, Z
Affiliations
[1] NTT Corp, Commun Sci Lab, Kyoto 6190237, Japan
[2] UCL, Gatsby Computat Neurosci Unit, London WC1N 3AR, England
Keywords
variational Bayes; Bayesian model search; mixture models; mixture of experts models; EM algorithm;
DOI
10.1016/S0893-6080(02)00040-0
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
When learning a mixture model, two problems arise: local optima and model structure determination. In this paper, we present a method for solving both problems simultaneously, based on the variational Bayesian (VB) framework. First, within the VB framework, we derive an objective function that can simultaneously optimize both the model parameter distributions and the model structure. Next, focusing on mixture models, we present a deterministic algorithm that approximately optimizes this objective function using the split and merge operations we previously proposed within the maximum likelihood framework. Finally, we apply the method to mixture of experts (MoE) models and show experimentally that the proposed method can find the optimal number of experts of an MoE while avoiding local maxima. (C) 2002 Elsevier Science Ltd. All rights reserved.
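The abstract's central idea is that a variational Bayesian objective can determine mixture structure, not just parameters. As a minimal illustration of that idea (not the authors' split-and-merge algorithm), the sketch below uses scikit-learn's `BayesianGaussianMixture`, whose VB objective with a Dirichlet weight prior drives the weights of surplus components toward zero; the data, seeds, and threshold are illustrative assumptions.

```python
# Sketch: variational Bayesian mixture fitting with scikit-learn's
# BayesianGaussianMixture. This is NOT Ueda & Ghahramani's split-and-merge
# VB algorithm; it only illustrates how a VB objective can prune surplus
# components, addressing the model-structure determination problem.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Two well-separated 1-D Gaussian clusters (illustrative data).
X = np.concatenate([rng.normal(-5, 1, 200),
                    rng.normal(5, 1, 200)]).reshape(-1, 1)

# Deliberately over-specify the model with 8 components; the Dirichlet
# prior on the mixing weights lets the VB bound shrink unused ones.
vbgmm = BayesianGaussianMixture(
    n_components=8,
    weight_concentration_prior=1e-2,  # small prior -> aggressive pruning
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that retain non-negligible weight (0.05 is an
# arbitrary illustrative threshold).
effective = int(np.sum(vbgmm.weights_ > 0.05))
print("effective components:", effective)
```

With well-separated clusters and a small weight concentration prior, the effective number of components collapses to roughly the true number (here, two), even though eight were allowed.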
Pages: 1223-1241
Page count: 19