Theoretical analysis and practical insights on importance sampling in Bayesian networks

Cited by: 10
Authors
Yuan, Changhe [1 ]
Druzdzel, Marek J. [2,3]
Affiliations
[1] Mississippi State Univ, Dept Comp Sci & Engn, Mississippi State, MS 39762 USA
[2] Univ Pittsburgh, Decis Syst Lab, Intelligent Syst Program, Pittsburgh, PA 15260 USA
[3] Univ Pittsburgh, Sch Informat Sci, Pittsburgh, PA 15260 USA
Keywords
DOI
10.1016/j.ijar.2006.09.006
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The AIS-BN algorithm [J. Cheng, M.J. Druzdzel, AIS-BN: An adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks, Journal of Artificial Intelligence Research 13 (2000) 155-188] is a successful importance sampling-based algorithm for Bayesian networks that relies on two heuristic methods to obtain an initial importance function: epsilon-cutoff, which replaces small probabilities in the conditional probability tables by a larger value epsilon, and setting the probability distributions of the parents of evidence nodes to uniform. However, why these simple heuristics are so effective was not well understood. In this paper, we point out that their effectiveness is due to a practical requirement on the importance function: a good importance function should possess thicker tails than the actual posterior probability distribution. By studying the basic assumptions behind importance sampling and the properties of importance sampling in Bayesian networks, we develop several theoretical insights into the desirability of thick tails for importance functions. These insights not only shed light on the success of the two heuristics of AIS-BN, but also provide a common theoretical basis for several other successful heuristic methods. (C) 2006 Elsevier Inc. All rights reserved.
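To make the tail-thickening idea concrete, the Python sketch below performs self-normalized importance sampling in a toy two-node network A -> B with evidence B = 1, sampling A from a prior adjusted by an epsilon-cutoff. The network structure, CPT values, function names, and choice of epsilon are illustrative assumptions for this sketch only; it is not the AIS-BN algorithm itself.

# Minimal sketch: importance sampling with an epsilon-cutoff importance
# function in a two-node network A -> B, with evidence B = 1.
# All numbers and names below are illustrative, not taken from the paper.
import random

P_A1 = 0.01                         # hypothetical prior P(A = 1)
P_B1_GIVEN_A = {0: 0.001, 1: 0.9}   # hypothetical CPT P(B = 1 | A)

def epsilon_cutoff(p, eps=0.05):
    # Raise very small probabilities to eps (and cap large ones at 1 - eps),
    # thickening the tails of the sampling distribution.
    return min(max(p, eps), 1.0 - eps)

def estimate_p_a1_given_b1(n_samples=100_000, eps=0.05):
    # Importance function: sample A from the epsilon-adjusted prior, then
    # weight each sample by P(A = a, B = 1) / Q(A = a).
    q_a1 = epsilon_cutoff(P_A1, eps)
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if random.random() < q_a1 else 0
        p_a = P_A1 if a == 1 else 1.0 - P_A1
        q_a = q_a1 if a == 1 else 1.0 - q_a1
        w = p_a * P_B1_GIVEN_A[a] / q_a
        num += w * a
        den += w
    return num / den   # self-normalized estimate of P(A = 1 | B = 1)

if __name__ == "__main__":
    print(estimate_p_a1_given_b1())   # exact value is approximately 0.9009

Sampling A from the unadjusted prior (0.01) would rarely produce the state that dominates the posterior, so a few samples would carry very large weights and the estimate would have high variance; raising small sampling probabilities to epsilon is the tail-thickening effect the abstract attributes to the epsilon-cutoff heuristic.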
Pages: 320-333
Number of pages: 14
References
25 in total
[1] Andrieu, C., de Freitas, N., Doucet, A., Jordan, M.I. (2003). An introduction to MCMC for machine learning. Machine Learning, 50, 5-43.
[2] [Anonymous] (1998). Introduction to Monte Carlo methods.
[3] [Anonymous] (1998). An introduction to variational methods for graphical models.
[4] Cheng, J., Druzdzel, M.J. (2000). AIS-BN: An adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks. Journal of Artificial Intelligence Research, 13, 155-188.
[5] Cooper, G.F. (1990). The computational complexity of probabilistic inference using Bayesian belief networks. Artificial Intelligence, 42(2-3), 393-405.
[6] Dagum, P., Luby, M. (1993). Approximating probabilistic inference in Bayesian belief networks is NP-hard. Artificial Intelligence, 60(1), 141-153.
[7] Druzdzel, M.J. (1994). In: Proceedings of the 10th Conference on Uncertainty in Artificial Intelligence, p. 187.
[8] Fung, R. (1989). In: Proceedings of the 5th Conference on Uncertainty in Artificial Intelligence, p. 209.
[9] Fung, R. (1994). In: Proceedings of the 10th Conference on Uncertainty in Artificial Intelligence, p. 227.
[10] Geweke, J. (1989). Bayesian inference in econometric models using Monte Carlo integration. Econometrica, 57(6), 1317-1339.