Priors in Bayesian Deep Learning: A Review
Cited by: 38
Author:
Fortuin, Vincent
[1]
Affiliation:
[1] Swiss Fed Inst Technol, Dept Comp Sci, Zurich, Switzerland
Keywords:
Bayesian deep learning;
Bayesian learning;
deep learning;
priors;
neural networks;
backpropagation;
inference;
mixtures;
DOI:
10.1111/insr.12502
Chinese Library Classification:
O21 [Probability Theory and Mathematical Statistics];
C8 [Statistics];
Discipline codes:
020208; 070103; 0714
Abstract:
While the choice of prior is one of the most critical parts of the Bayesian inference workflow, recent Bayesian deep learning models have often fallen back on vague priors, such as standard Gaussians. In this review, we highlight the importance of prior choices for Bayesian deep learning and present an overview of different priors that have been proposed for (deep) Gaussian processes, variational autoencoders and Bayesian neural networks. We also outline different methods of learning priors for these models from data. We hope to motivate practitioners in Bayesian deep learning to think more carefully about the prior specification for their models and to provide them with some inspiration in this regard.
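The "vague prior" the abstract refers to can be made concrete: placing an independent standard Gaussian N(0, 1) over every weight of a Bayesian neural network induces a distribution over functions that can be visualized by sampling. The following is a minimal NumPy sketch of such prior predictive draws for a one-hidden-layer network; the network shape and `tanh` activation are illustrative assumptions, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_prior_mlp(x, hidden=50, rng=rng):
    """Draw one function from the N(0, 1)-weight prior of a 1-hidden-layer MLP.

    Every weight and bias is sampled independently from a standard Gaussian,
    i.e. the 'vague' default prior discussed in the abstract.
    """
    w1 = rng.standard_normal((1, hidden))   # input-to-hidden weights ~ N(0, 1)
    b1 = rng.standard_normal(hidden)        # hidden biases ~ N(0, 1)
    w2 = rng.standard_normal((hidden, 1))   # hidden-to-output weights ~ N(0, 1)
    b2 = rng.standard_normal(1)             # output bias ~ N(0, 1)
    h = np.tanh(x @ w1 + b1)                # hidden-layer activations
    return h @ w2 + b2                      # one prior predictive function draw

# 100 draws from the prior predictive, evaluated at 7 input locations
x = np.linspace(-3.0, 3.0, 7).reshape(-1, 1)
draws = np.stack([sample_prior_mlp(x) for _ in range(100)])
print(draws.shape)  # (100, 7, 1)
```

Plotting such draws is a standard way to check whether a weight-space prior encodes sensible assumptions in function space, which is one motivation for the more structured priors the review surveys.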
Pages: 563-591 (29 pages)