Learning Bounded Tree-Width Bayesian Networks via Sampling

Cited by: 10
Authors
Nie, Siqi [1 ]
de Campos, Cassio P. [2 ]
Ji, Qiang [1 ]
Affiliations
[1] Rensselaer Polytech Inst, Dept Elect Comp & Syst Engn, Troy, NY 12180 USA
[2] Queens Univ Belfast, Sch Elect Elect Engn & Comp Sci, Belfast, Northern Ireland
Source
SYMBOLIC AND QUANTITATIVE APPROACHES TO REASONING WITH UNCERTAINTY, ECSQARU 2015 | 2015, Vol. 9161
Keywords
Bayesian network; Structure learning; Bounded tree-width; Treewidth
DOI
10.1007/978-3-319-20807-7_35
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Learning Bayesian networks with bounded tree-width has attracted much attention recently, because low tree-width allows exact inference to be performed efficiently. Some existing methods [12,14] tackle the problem by using k-trees to learn the optimal Bayesian network with tree-width up to k. In this paper, we propose a sampling method to efficiently find representative k-trees, introducing an informative score function to characterize the quality of a k-tree. The proposed algorithm can efficiently learn a Bayesian network with tree-width at most k. Experimental results indicate that our approach is comparable with exact methods, but is much more computationally efficient.
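The objects being sampled in the abstract are k-trees, the maximal graphs of tree-width k. As a minimal sketch of what such an object looks like (this is the standard incremental construction of a random k-tree, not the authors' sampler or their informative score; the paper's method encodes k-trees via the bijective coding of Caminiti et al. [5], and the helper name `sample_ktree` is ours):

```python
import itertools
import random

def sample_ktree(n, k, rng=None):
    """Build a random k-tree on vertices 0..n-1 (requires n > k).

    Standard incremental construction: start from a (k+1)-clique,
    then attach each new vertex to a randomly chosen existing
    k-clique. Every graph produced this way is a k-tree, so its
    tree-width is exactly k. Note this does NOT sample uniformly
    over all k-trees; it only illustrates the object.
    """
    rng = rng or random.Random()
    edges = set()
    # Initial (k+1)-clique on vertices 0..k.
    for u, v in itertools.combinations(range(k + 1), 2):
        edges.add((u, v))
    # All k-cliques currently available as attachment points.
    cliques = [frozenset(c) for c in itertools.combinations(range(k + 1), k)]
    for v in range(k + 1, n):
        c = rng.choice(cliques)  # attachment k-clique for the new vertex
        for u in c:
            edges.add((min(u, v), max(u, v)))
        # The new vertex v creates k new k-cliques: {v} together with
        # each (k-1)-subset of the chosen attachment clique.
        for sub in itertools.combinations(sorted(c), k - 1):
            cliques.append(frozenset(sub) | {v})
    return edges
```

A quick sanity check on the construction: a k-tree on n vertices always has exactly k*n - k*(k+1)/2 edges (the initial clique contributes k(k+1)/2, and each of the remaining n-k-1 vertices adds k).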
Pages: 387-396
Number of pages: 10
References
17 total
  • [1] Arnborg S., Corneil D.G., Proskurowski A. Complexity of finding embeddings in a k-tree. SIAM Journal on Algebraic and Discrete Methods, 1987, 8(2): 277-284
  • [2] Bache K. UCI Machine Learning Repository, 2013
  • [3] Berg J. JMLR Workshop and Conference Proceedings, 2014, 33: 86
  • [4] Buntine W. Proceedings of the Conference on Uncertainty in Artificial Intelligence, 1991, p. 52
  • [5] Caminiti S., Fusco E.G., Petreschi R. Bijective linear time coding and decoding for k-trees. Theory of Computing Systems, 2010, 46(2): 284-300
  • [6] Cooper G.F., Herskovits E. A Bayesian method for the induction of probabilistic networks from data. Machine Learning, 1992, 9(4): 309-347
  • [7] de Campos C.P. Journal of Machine Learning Research, 2011, 12: 663
  • [8] Eaton D. Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence, 2007, p. 101
  • [9] Elidan G. Journal of Machine Learning Research, 2008, 9: 2699
  • [10] Heckerman D. Machine Learning, 1995, 20: 197. DOI: 10.1007/BF00994016