A Deep Learning Method for Comparing Bayesian Hierarchical Models

Times Cited: 1
Authors
Elsemueller, Lasse [1 ]
Schnuerch, Martin [2 ]
Buerkner, Paul-Christian [3 ]
Radev, Stefan T. [4 ]
Affiliations
[1] Heidelberg Univ, Inst Psychol, Hauptstr 47, D-69117 Heidelberg, Germany
[2] Univ Mannheim, Dept Psychol, Mannheim, Germany
[3] TU Dortmund Univ, Dept Stat, Dortmund, Germany
[4] Heidelberg Univ, Cluster Excellence STRUCTURES, Heidelberg, Germany
Keywords
Bayesian statistics; model comparison; hierarchical modeling; deep learning; cognitive modeling; processing tree models; Monte Carlo; normalizing constants; choice
DOI
10.1037/met0000645
Chinese Library Classification
B84 [Psychology]
Discipline codes
04; 0402
Abstract
Bayesian model comparison (BMC) offers a principled approach to assessing the relative merits of competing computational models and propagating uncertainty into model selection decisions. However, BMC is often intractable for the popular class of hierarchical models due to their high-dimensional nested parameter structure. To address this intractability, we propose a deep learning method for performing BMC on any set of hierarchical models that can be instantiated as probabilistic programs. Since our method enables amortized inference, it allows efficient re-estimation of posterior model probabilities and fast performance validation prior to any real-data application. In a series of extensive validation studies, we benchmark the performance of our method against the state-of-the-art bridge sampling method and demonstrate excellent amortized inference across all BMC settings. We then showcase our method by comparing four hierarchical evidence accumulation models that have previously been deemed intractable for BMC due to partly implicit likelihoods. Additionally, we demonstrate how transfer learning can be leveraged to enhance training efficiency. We provide reproducible code for all analyses and an open-source implementation of our method.
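The amortized workflow the abstract describes (simulate datasets from each candidate hierarchical model, train an approximator on those simulations, then reuse it to estimate posterior model probabilities for any new dataset without refitting) can be illustrated with a deliberately simplified sketch. Everything below is a hypothetical stand-in, not the paper's actual method: two toy Gaussian hierarchical models, hand-picked permutation-invariant summary statistics, and a plain logistic regression in place of a deep invariant network.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(model, n_groups=8, n_obs=20):
    """Draw one hierarchical dataset: group means, then observations.

    The two toy models share the group-level prior but differ in
    observation noise (sigma = 1 vs. sigma = 2).
    """
    mu = rng.normal(0.0, 1.0, n_groups)              # group-level parameters
    sigma = 1.0 if model == 0 else 2.0
    return rng.normal(mu[:, None], sigma, (n_groups, n_obs))

def summarize(y):
    """Permutation-invariant summary statistics of one dataset."""
    return np.array([y.var(axis=1).mean(),           # within-group variance
                     y.mean(axis=1).var(),           # between-group variance
                     np.abs(y).mean()])              # overall scale

# Training phase: equal prior over models, simulated datasets -> summaries.
labels = rng.integers(0, 2, 2000)
X = np.stack([summarize(simulate(m)) for m in labels])
mu_X, sd_X = X.mean(0), X.std(0)                     # stored for reuse
X = (X - mu_X) / sd_X
X = np.hstack([X, np.ones((len(X), 1))])             # bias column

# Logistic regression by gradient descent approximates P(model = 1 | data).
w = np.zeros(X.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - labels) / len(labels)

def posterior_prob(y):
    """Amortized inference: new dataset -> approximate P(model = 1 | y)."""
    x = np.append((summarize(y) - mu_X) / sd_X, 1.0)
    return 1.0 / (1.0 + np.exp(-x @ w))
```

The upfront simulation and training cost is paid once; afterwards `posterior_prob` evaluates any new dataset near-instantly, which is the sense in which the inference is amortized, and the same trained approximator can be validated on fresh simulations before touching real data.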
Pages: 30
References (110 in total)
  • [1] Abadi M., 2016, arXiv, DOI 10.48550/arXiv.1603.04467
  • [2] Barron A, 1999, ANN STAT, V27, P536
  • [3] Approximate Bayesian Computation in Evolution and Ecology
    Beaumont, Mark A.
    [J]. ANNUAL REVIEW OF ECOLOGY, EVOLUTION, AND SYSTEMATICS, VOL 41, 2010, 41 : 379 - 406
  • [4] Bengio Yoshua, 2009, P 26 ANN INT C MACHI, P41
  • [5] Efficient estimation of free-energy differences from Monte Carlo data
    Bennett, C. H.
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 1976, 22 (02) : 245 - 268
  • [6] Bernardo JM., 1994, Bayesian theory, DOI 10.1002/9780470316870
  • [7] Julia: A Fresh Approach to Numerical Computing
    Bezanson, Jeff
    Edelman, Alan
    Karpinski, Stefan
    Shah, Viral B.
    [J]. SIAM REVIEW, 2017, 59 (01) : 65 - 98
  • [8] Bloem-Reddy B, 2020, J MACH LEARN RES, V21
  • [9] Estimating across-trial variability parameters of the Diffusion Decision Model: Expert advice and recommendations
    Boehm, Udo
    Annis, Jeffrey
    Frank, Michael J.
    Hawkins, Guy E.
    Heathcote, Andrew
    Kellen, David
    Krypotos, Angelos-Miltiadis
    Lerche, Veronika
    Logan, Gordon D.
    Palmeri, Thomas J.
    van Ravenzwaaij, Don
    Servant, Mathieu
    Singmann, Henrik
    Starns, Jeffrey J.
    Voss, Andreas
    Wiecki, Thomas V.
    Matzke, Dora
    Wagenmakers, Eric-Jan
    [J]. JOURNAL OF MATHEMATICAL PSYCHOLOGY, 2018, 87 : 46 - 75
  • [10] Bürkner PC, 2023, arXiv, DOI 10.48550/arXiv.2209.02439