Approximating the Kullback Leibler Divergence between Gaussian Mixture Models

Cited by: 711
Authors
Hershey, John R.
Olsen, Peder A.
Institution
Source
2007 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL IV, PTS 1-3 | 2007
Keywords
Kullback Leibler divergence; variational methods; Gaussian mixture models; unscented transformation;
DOI
10.1109/icassp.2007.366913
Chinese Library Classification (CLC) number
O42 [Acoustics];
Discipline classification codes
070206 ; 082403 ;
Abstract
The Kullback Leibler (KL) Divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two Gaussian Mixture Models (GMMs) is frequently needed in the fields of speech and image recognition. Unfortunately the KL divergence between two GMMs is not analytically tractable, nor does any efficient computational algorithm exist. Some techniques cope with this problem by replacing the KL divergence with other functions that can be computed efficiently. We introduce two new methods, the variational approximation and the variational upper bound, and compare them to existing methods. We discuss seven different techniques in total and weigh the benefits of each one against the others. To conclude we evaluate the performance of each one through numerical experiments.
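As a rough illustration of the kind of technique the abstract describes, the sketch below implements the variational approximation to the KL divergence between two GMMs in the closed form commonly associated with this paper, restricted to diagonal-covariance components for brevity (the paper itself is not limited to this case). The helper names (gauss_kl_diag, variational_kl) and the tiny usage example are invented for this illustration and assume only NumPy; this is a minimal sketch, not the authors' reference implementation.

```python
import numpy as np

def _logsumexp(x):
    """Numerically stable log(sum(exp(x)))."""
    x = np.asarray(x, dtype=float)
    m = x.max()
    return m + np.log(np.exp(x - m).sum())

def gauss_kl_diag(mu_a, var_a, mu_b, var_b):
    """Closed-form KL(N_a || N_b) for diagonal-covariance Gaussians."""
    mu_a, var_a = np.asarray(mu_a, float), np.asarray(var_a, float)
    mu_b, var_b = np.asarray(mu_b, float), np.asarray(var_b, float)
    return 0.5 * np.sum(np.log(var_b / var_a)
                        + (var_a + (mu_a - mu_b) ** 2) / var_b
                        - 1.0)

def variational_kl(pi_f, mu_f, var_f, pi_g, mu_g, var_g):
    """Variational approximation of KL(f || g) for two GMMs:

        D_var = sum_a pi_a * log( sum_a' pi_a' exp(-KL(f_a || f_a'))
                                / sum_b  w_b   exp(-KL(f_a || g_b)) )

    where each pairwise term uses the closed-form Gaussian KL above.
    """
    per_component = []
    for a in range(len(pi_f)):
        # Self-similarity term: f component a against every component of f.
        num = [np.log(pi_f[ap]) - gauss_kl_diag(mu_f[a], var_f[a], mu_f[ap], var_f[ap])
               for ap in range(len(pi_f))]
        # Cross term: f component a against every component of g.
        den = [np.log(pi_g[b]) - gauss_kl_diag(mu_f[a], var_f[a], mu_g[b], var_g[b])
               for b in range(len(pi_g))]
        per_component.append(_logsumexp(num) - _logsumexp(den))
    return float(np.dot(pi_f, per_component))

# Tiny usage example: two 1-D mixtures with two components each.
pi_f, mu_f, var_f = [0.5, 0.5], [[-1.0], [1.0]], [[1.0], [1.0]]
pi_g, mu_g, var_g = [0.7, 0.3], [[-0.5], [2.0]], [[1.5], [0.5]]
print(variational_kl(pi_f, mu_f, var_f, pi_g, mu_g, var_g))
```

The log-sum-exp form is used so that the nested exponentials of pairwise KL terms stay numerically stable when some divergences are large.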
Pages: 317 - 320
Number of pages: 4