Local dimension reduction of summary statistics for likelihood-free inference

Cited: 0
Authors
Siren, Jukka [1 ]
Kaski, Samuel [1 ]
Affiliation
[1] Aalto Univ, Helsinki Inst Informat Technol HIIT, Dept Comp Sci, Espoo, Finland
Keywords
Approximate Bayesian computation; Dimension reduction; Likelihood-free inference; Summary statistics; APPROXIMATE BAYESIAN COMPUTATION; SELECTION
DOI
10.1007/s11222-019-09905-w
Chinese Library Classification
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
Approximate Bayesian computation (ABC) and other likelihood-free inference methods have gained popularity in the last decade, as they allow rigorous statistical inference for complex models without analytically tractable likelihood functions. A key component for accurate inference with ABC is the choice of summary statistics, which summarize the information in the data, but at the same time should be low-dimensional for efficiency. Several dimension reduction techniques have been introduced to automatically construct informative and low-dimensional summaries from a possibly large pool of candidate summaries. Projection-based methods, which are based on learning simple functional relationships from the summaries to parameters, are widely used and usually perform well, but might fail when the assumptions behind the transformation are not satisfied. We introduce a localization strategy for any projection-based dimension reduction method, in which the transformation is estimated in the neighborhood of the observed data instead of the whole space. Localization strategies have been suggested before, but the performance of the transformed summaries outside the local neighborhood has not been guaranteed. In our localization approach the transformation is validated and optimized over validation datasets, ensuring reliable performance. We demonstrate the improvement in the estimation accuracy for localized versions of linear regression and partial least squares, for three different models of varying complexity.
Pages: 559-570
Number of pages: 12
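
To make the localization strategy described in the abstract concrete, below is a minimal illustrative sketch in Python/NumPy. Candidate summaries from a toy Gaussian model are reduced to a single projection by linear regression from summaries to the parameter (a semi-automatic-ABC-style projection); the regression is refit using only simulations whose summaries fall in a neighborhood of the observed summaries, and the neighborhood size is chosen by predictive error on held-out validation simulations, echoing the validation step the abstract describes. The toy model, function names, and candidate neighborhood fractions are illustrative assumptions, not code from the paper.

# A minimal sketch of localized projection-based summary construction,
# on a toy Gaussian model. Names and settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Toy model: n draws from N(theta, 1); return a pool of candidate summaries.
    x = rng.normal(theta, 1.0, size=n)
    return np.array([x.mean(), x.var(), x.min(), x.max()])

def fit_projection(S, th):
    # Projection-based reduction: linear regression from summaries to the parameter.
    X = np.column_stack([np.ones(len(S)), S])
    beta, *_ = np.linalg.lstsq(X, th, rcond=None)
    return beta

def project(S, beta):
    return np.column_stack([np.ones(len(S)), S]) @ beta

def localized_fit(frac, S, th, s_obs, S_val, th_val):
    # Fit the regression only on the `frac` fraction of simulations whose
    # summaries are closest to the observed ones, then score the fitted
    # transformation on held-out validation simulations.
    d = np.linalg.norm(S - s_obs, axis=1)
    keep = d <= np.quantile(d, frac)
    beta = fit_projection(S[keep], th[keep])
    val_err = np.mean((project(S_val, beta) - th_val) ** 2)
    return beta, val_err

# Reference table of prior draws and their candidate summaries.
thetas = rng.uniform(-5, 5, size=5000)
summaries = np.array([simulate(t) for t in thetas])
s_obs = simulate(1.3)  # pseudo-observed data

S_tr, th_tr = summaries[:4000], thetas[:4000]
S_val, th_val = summaries[4000:], thetas[4000:]

# Pick the neighborhood size with the lowest validation error (1.0 = no localization).
beta_star, _ = min((localized_fit(f, S_tr, th_tr, s_obs, S_val, th_val)
                    for f in (0.05, 0.1, 0.25, 0.5, 1.0)), key=lambda r: r[1])

# Use the one-dimensional transformed summary for rejection ABC.
t_obs = project(s_obs[None, :], beta_star)[0]
dist = np.abs(project(summaries, beta_star) - t_obs)
accepted = thetas[dist <= np.quantile(dist, 0.01)]
print("approximate posterior mean:", accepted.mean())

The validation step is what distinguishes this from earlier localization heuristics: each candidate neighborhood is scored on simulations that were not used to fit the transformation, so a local fit that generalizes poorly is not selected.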