Faster MCMC for Gaussian latent position network models

Cited by: 1
Authors
Spencer, Neil A. [1 ]
Junker, Brian W. [2 ]
Sweet, Tracy M. [3 ]
Affiliations
[1] Harvard Univ, Boston, MA 02115 USA
[2] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[3] Univ Maryland, College Pk, MD 20742 USA
Funding
National Science Foundation (USA); Natural Sciences and Engineering Research Council of Canada
Keywords
Hamiltonian Monte Carlo; network data; firefly Monte Carlo; latent space model; longitudinal network data; Bayesian computation; inference
DOI
10.1017/nws.2022.1
Chinese Library Classification (CLC)
O1 [Mathematics]; C [Social Sciences, General]
Discipline classification codes
03; 0303; 0701; 070101
Abstract
Latent position network models are a versatile tool in network science; applications include clustering entities, controlling for causal confounders, and defining priors over unobserved graphs. Estimating each node's latent position is typically framed as a Bayesian inference problem, with Metropolis within Gibbs being the most popular tool for approximating the posterior distribution. However, it is well known that Metropolis within Gibbs is inefficient for large networks; the acceptance ratios are expensive to compute, and the resultant posterior draws are highly correlated. In this article, we propose an alternative Markov chain Monte Carlo strategy, defined using a combination of split Hamiltonian Monte Carlo and Firefly Monte Carlo, that leverages the posterior distribution's functional form for more efficient posterior computation. We demonstrate that these strategies outperform Metropolis within Gibbs and other algorithms on synthetic networks, as well as on real information-sharing networks of teachers and staff in a school district.
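To make the abstract's setup concrete, the following is a minimal sketch of a Gaussian latent position (latent distance) model posterior and one vanilla Hamiltonian Monte Carlo update. This is not the paper's split HMC/Firefly scheme: the intercept `alpha`, the function names, and the finite-difference gradient are all illustrative assumptions standing in for the analytic gradients and Gaussian-prior splitting that the paper actually exploits.

```python
import numpy as np

def log_posterior(Z, A, alpha=1.0, prior_sd=1.0):
    """Log posterior of a latent distance model:
    logit P(A_ij = 1) = alpha - ||z_i - z_j||, with iid N(0, prior_sd^2)
    priors on the latent coordinates. A is a symmetric 0/1 adjacency matrix."""
    n = A.shape[0]
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    eta = alpha - D
    iu = np.triu_indices(n, k=1)  # each undirected dyad counted once
    loglik = np.sum(A[iu] * eta[iu] - np.logaddexp(0.0, eta[iu]))
    logprior = -0.5 * np.sum(Z ** 2) / prior_sd ** 2
    return loglik + logprior

def grad_log_posterior(Z, A, alpha=1.0, prior_sd=1.0, eps=1e-5):
    """Central finite-difference gradient (illustration only; far too slow
    for real networks, where one would use the analytic gradient)."""
    g = np.zeros_like(Z)
    for idx in np.ndindex(Z.shape):
        Zp, Zm = Z.copy(), Z.copy()
        Zp[idx] += eps
        Zm[idx] -= eps
        g[idx] = (log_posterior(Zp, A, alpha, prior_sd)
                  - log_posterior(Zm, A, alpha, prior_sd)) / (2 * eps)
    return g

def hmc_step(Z, A, step=0.05, n_leapfrog=10, rng=None):
    """One plain HMC update of all latent positions jointly:
    leapfrog integration followed by a Metropolis accept/reject."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(Z.shape)
    Zc, p = Z.copy(), p0.copy()
    p = p + 0.5 * step * grad_log_posterior(Zc, A)       # half kick
    for i in range(n_leapfrog):
        Zc = Zc + step * p                               # drift
        if i < n_leapfrog - 1:
            p = p + step * grad_log_posterior(Zc, A)     # full kick
    p = p + 0.5 * step * grad_log_posterior(Zc, A)       # final half kick
    log_ratio = (log_posterior(Zc, A) - log_posterior(Z, A)
                 - 0.5 * (np.sum(p ** 2) - np.sum(p0 ** 2)))
    if np.log(rng.uniform()) < log_ratio:
        return Zc, True
    return Z, False

# Tiny demo: a 6-node undirected network with 2-d latent positions.
rng = np.random.default_rng(1)
n, d = 6, 2
Z0 = rng.standard_normal((n, d))
A = np.triu((rng.uniform(size=(n, n)) < 0.3).astype(float), 1)
A = A + A.T
Z1, accepted = hmc_step(Z0, A, step=0.05, n_leapfrog=5, rng=rng)
```

Updating all positions in one gradient-informed proposal is what distinguishes HMC-style samplers from Metropolis within Gibbs, which perturbs one node at a time and therefore mixes slowly when positions are strongly coupled through the likelihood.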
Pages: 20-45 (26 pages)