VARIANCE-BASED SENSITIVITY OF BAYESIAN INVERSE PROBLEMS TO THE PRIOR DISTRIBUTION

Cited by: 0
Authors
Darges, John E. [1]
Alexanderian, Alen [1]
Gremaud, Pierre A. [1,2]
Affiliations
[1] North Carolina State Univ, Dept Math, Raleigh, NC 27607 USA
[2] North Carolina State Univ, Grad Sch, Raleigh, NC 27607 USA
Funding
US National Science Foundation
Keywords
prior hyperparameters; global sensitivity analysis; Sobol' indices; Bayesian inverse problems; importance sampling; surrogate modeling; MONTE-CARLO; COMPUTATION; LIKELIHOOD; INDEXES; MOMENTS; DESIGN; MODELS; MCMC
DOI
10.1615/Int.J.UncertaintyQuantification.2024051475
Abstract
The formulation of Bayesian inverse problems involves choosing prior distributions; choices that seem equally reasonable may lead to significantly different conclusions. We develop a computational approach to understand the impact of the hyperparameters defining the prior on the posterior statistics of the quantities of interest. Our approach relies on global sensitivity analysis (GSA) of Bayesian inverse problems with respect to the prior hyperparameters. This, however, is a challenging problem: a naive double-loop sampling approach would require running a prohibitive number of Markov chain Monte Carlo (MCMC) sampling procedures. The present work takes a foundational step in making such a sensitivity analysis practical by combining efficient surrogate models with a tailored importance sampling approach. In particular, we can perform accurate GSA of posterior statistics of quantities of interest with respect to prior hyperparameters without the need to repeat MCMC runs. We demonstrate the effectiveness of the approach on a simple Bayesian linear inverse problem and a nonlinear inverse problem governed by an epidemiological model.
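The paper's GSA is built on Sobol' indices of posterior statistics viewed as functions of the prior hyperparameters. As a hedged illustration (not the authors' code), the sketch below estimates first-order Sobol' indices with a standard pick-freeze (Saltelli-type) Monte Carlo estimator for a toy linear function standing in for such a posterior statistic; the function `f` and sample size are illustrative assumptions.

```python
# Sketch: first-order Sobol' index estimation by pick-freeze sampling.
# The toy model f(x) = x1 + 2*x2 on [0,1]^2 stands in for a posterior
# statistic regarded as a function of two prior hyperparameters; its
# analytic first-order indices are 1/5 and 4/5.
import numpy as np

def first_order_sobol(f, dim, n, rng):
    """Estimate first-order Sobol' indices of f over [0,1]^dim."""
    A = rng.random((n, dim))          # base sample
    B = rng.random((n, dim))          # independent "pick" sample
    fA, fB = f(A), f(B)
    total_var = np.var(np.concatenate([fA, fB]))
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]           # freeze all inputs but the i-th
        # Saltelli (2010) estimator of V_i = Var(E[f | x_i])
        S[i] = np.mean(fB * (f(ABi) - fA)) / total_var
    return S

rng = np.random.default_rng(0)
f = lambda X: X[:, 0] + 2.0 * X[:, 1]
S = first_order_sobol(f, dim=2, n=200_000, rng=rng)
```

For this additive toy model the indices sum to one; in the paper's setting the inner function evaluation is itself a posterior statistic, which is where the surrogate models and importance sampling become essential.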
Pages: 65-90 (26 pages)