DiBS: Differentiable Bayesian Structure Learning

Cited by: 0
Authors
Lorch, Lars [1 ]
Rothfuss, Jonas [1 ]
Schoelkopf, Bernhard [2 ]
Krause, Andreas [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Zurich, Switzerland
[2] MPI Intelligent Syst, Tubingen, Germany
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) | 2021 / Vol. 34
Funding
Swiss National Science Foundation; European Research Council
Keywords
MARKOV EQUIVALENCE CLASSES; GRAPHICAL MODELS; STRUCTURE DISCOVERY; NETWORK STRUCTURE;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Bayesian structure learning allows inferring Bayesian network structure from data while reasoning about the epistemic uncertainty, a key element towards enabling active causal discovery and designing interventions in real-world systems. In this work, we propose a general, fully differentiable framework for Bayesian structure learning (DiBS) that operates in the continuous space of a latent probabilistic graph representation. Contrary to existing work, DiBS is agnostic to the form of the local conditional distributions and allows for joint posterior inference of both the graph structure and the conditional distribution parameters. This makes our formulation directly applicable to posterior inference of complex Bayesian network models, e.g., with nonlinear dependencies encoded by neural networks. Using DiBS, we devise an efficient, general-purpose variational inference method for approximating distributions over structural models. In evaluations on simulated and real-world data, our method significantly outperforms related approaches to joint posterior inference.
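The abstract only describes the continuous latent probabilistic graph representation at a high level. The following is a minimal, illustrative sketch of one common way such a representation can be parameterized: each node receives latent embeddings, and the probability of a directed edge is a sigmoid of a pairwise inner product, from which binary graphs can be sampled. This is not the authors' implementation; the function names, embedding shapes, and the temperature parameter `alpha` are assumptions made for illustration.

```python
# Illustrative sketch of a latent probabilistic graph representation
# (assumption: edge probability modeled as sigmoid(alpha * u_i . v_j)).
import numpy as np

def edge_probabilities(U, V, alpha=1.0):
    """Map latent node embeddings to a matrix of edge probabilities.

    U, V: arrays of shape (d, k), one k-dimensional embedding per node.
    Returns a (d, d) matrix P with P[i, j] = sigmoid(alpha * U[i] . V[j])
    and zeros on the diagonal (no self-loops).
    """
    scores = alpha * U @ V.T                  # pairwise inner products
    probs = 1.0 / (1.0 + np.exp(-scores))     # element-wise sigmoid
    np.fill_diagonal(probs, 0.0)              # disallow self-edges
    return probs

def sample_graph(P, rng):
    """Draw one binary adjacency matrix from independent Bernoulli edges."""
    return (rng.random(P.shape) < P).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, k = 5, 3                               # 5 nodes, 3-dim embeddings
    U = rng.normal(size=(d, k))
    V = rng.normal(size=(d, k))
    P = edge_probabilities(U, V, alpha=2.0)   # soft graph representation
    G = sample_graph(P, rng)                  # one sampled hard graph
    print(P.round(2))
    print(G)
```

Because the edge probabilities are a smooth function of the latent embeddings, gradient-based (variational) inference can operate on the embeddings rather than on discrete graphs, which is the idea the abstract refers to.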
Pages: 13