The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design

Cited: 18
Authors
Oladyshkin, Sergey [1 ]
Nowak, Wolfgang [1 ]
Affiliations
[1] Univ Stuttgart, Dept Stochast Simulat & Safety Res Hydrosyst, Inst Modelling Hydraul & Environm Syst SC SimTech, Pfaffenwaldring 5a, D-70569 Stuttgart, Germany
Keywords
model evidence; entropy; model selection; information entropy; Bayesian experimental design; Kullback-Leibler divergence; Markov chain Monte Carlo; Monte Carlo; MARGINAL LIKELIHOOD; CRITERION; CHOICE
DOI
10.3390/e21111081
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques; here, we invoke various assumptions and draw relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior- and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. Among the posterior-based techniques, the multivariate Gaussian posterior estimate involves the fewest assumptions and shows the best performance for estimating BME, information entropy and experiment utility.
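The connection the abstract describes can be made concrete with a small numerical sketch. The Python example below is an illustration under assumed settings, not code from the paper: it uses a toy conjugate Gaussian model with hypothetical parameters (sigma0, sigma, theta_true) to show (i) a prior-based Monte Carlo BME estimate and (ii) the posterior-based relative-entropy identity ln BME = E_posterior[ln p(y|theta)] - D_KL(posterior || prior), checked against the closed-form Gaussian KL divergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (illustrative assumption, not from the paper):
#   prior      theta ~ N(0, sigma0^2)
#   likelihood y_i | theta ~ N(theta, sigma^2), i = 1..n
sigma0, sigma, n = 2.0, 1.0, 10
theta_true = 1.5
y = rng.normal(theta_true, sigma, size=n)

def log_like(theta):
    """Log-likelihood ln p(y | theta) for an array of theta values."""
    theta = np.atleast_1d(theta)[:, None]          # shape (N, 1)
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * ((y - theta) ** 2).sum(axis=1) / sigma**2)

# Prior-based (Monte Carlo) BME estimate:
#   BME ~= (1/N) * sum_j p(y | theta_j), theta_j drawn from the prior.
# Log-sum-exp is used for numerical stability.
N = 100_000
theta_prior = rng.normal(0.0, sigma0, size=N)
log_bme_mc = np.logaddexp.reduce(log_like(theta_prior)) - np.log(N)

# Conjugate posterior is Gaussian: N(mu_n, tau2).
tau2 = 1.0 / (1.0 / sigma0**2 + n / sigma**2)
mu_n = tau2 * y.sum() / sigma**2

# Posterior-based relative entropy via the identity
#   ln BME = E_post[ln p(y|theta)] - D_KL(posterior || prior),
# rearranged to estimate D_KL from posterior samples and log BME.
theta_post = rng.normal(mu_n, np.sqrt(tau2), size=N)
dkl_est = log_like(theta_post).mean() - log_bme_mc

# Closed-form KL divergence between the two Gaussians as a check.
dkl_exact = (np.log(sigma0 / np.sqrt(tau2))
             + (tau2 + mu_n**2) / (2 * sigma0**2) - 0.5)

print(f"log BME (prior MC) : {log_bme_mc:.4f}")
print(f"D_KL estimate      : {dkl_est:.4f}")
print(f"D_KL exact         : {dkl_exact:.4f}")
```

The two D_KL values should agree up to Monte Carlo error; this agreement also validates the prior-based log-BME estimate, since the posterior-based D_KL estimate is computed from it.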
Pages: 19
Related Papers
50 records in total
  • [1] Model Selection and Psychological Theory: A Discussion of the Differences Between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC)
    Vrieze, Scott I.
    PSYCHOLOGICAL METHODS, 2012, 17 (02) : 228 - 243
  • [2] Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain
    Beck, Joakim
    Dia, Ben Mansour
    Espath, Luis F. R.
    Long, Quan
    Tempone, Raul
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2018, 334 : 523 - 553
  • [3] Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems
    Mohammad-Djafari, Ali
    ENTROPY, 2015, 17 (06) : 3989 - 4027
  • [4] Bayesian or Laplacien inference, Entropy and Information theory and Information Geometry in data and signal processing
    Mohammad-Djafari, Ali
    BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING (MAXENT 2014), 2015, 1641 : 43 - 58
  • [5] A review of Information Field Theory for Bayesian inference of random fields
    Pandey, Aditya
    Singh, Ashmeet
    Gardoni, Paolo
    STRUCTURAL SAFETY, 2022, 99
  • [6] Bayesian bin distribution inference and mutual information
    Endres, D
    Földiák, P
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2005, 51 (11) : 3766 - 3779
  • [7] On Model Selection, Bayesian Networks, and the Fisher Information Integral
    Zou, Yuan
    Roos, Teemu
    NEW GENERATION COMPUTING, 2017, 35 (01) : 5 - 27
  • [8] Approximated Information Analysis in Bayesian Inference
    Seo, Jung In
    Kim, Yongku
    ENTROPY, 2015, 17 (03) : 1441 - 1451
  • [9] Improved Bayesian information criterion for mixture model selection
    Mehrjou, Arash
    Hosseini, Reshad
    Araabi, Babak Nadjar
    PATTERN RECOGNITION LETTERS, 2016, 69 : 22 - 27