Irreversible Langevin samplers and variance reduction: a large deviations approach

Cited: 60
Authors
Rey-Bellet, Luc [1]
Spiliopoulos, Konstantinos [2]
Affiliations
[1] Univ Massachusetts, Dept Math & Stat, Amherst, MA 01003 USA
[2] Boston Univ, Dept Math & Stat, Boston, MA 02215 USA
Funding
U.S. National Science Foundation;
Keywords
non-reversible Monte Carlo Markov chains; large deviations; variance reduction; Langevin equation; MARKOV PROCESS EXPECTATIONS; ASYMPTOTIC EVALUATION; CONVERGENCE;
DOI
10.1088/0951-7715/28/7/2081
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
In order to sample from a given target distribution (often of Gibbs type), the Monte Carlo Markov chain method consists of constructing an ergodic Markov process whose invariant measure is the target distribution. By sampling the Markov process one can then compute, approximately, expectations of observables with respect to the target distribution. Often the Markov processes used in practice are time-reversible (i.e. they satisfy detailed balance), but our main goal here is to assess and quantify how the addition of a non-reversible part to the process can be used to improve its sampling properties. We focus on the diffusion setting (overdamped Langevin equations), where the drift consists of a gradient vector field together with another drift that breaks the reversibility of the process but is chosen to preserve the Gibbs measure. In this paper we use the large deviation rate function for the empirical measure as a tool to analyze the speed of convergence to the invariant measure. We show that the addition of an irreversible drift leads to a larger rate function and strictly improves the speed of convergence of ergodic averages for (generic smooth) observables. We also deduce from this result that the asymptotic variance decreases under the addition of the irreversible drift, and we give an explicit characterization, in terms of a nonlinear Poisson equation, of the observables whose variance is not reduced. Our theoretical results are illustrated and supplemented by numerical simulations.
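The construction described in the abstract can be illustrated with a minimal sketch (not taken from the paper): take the overdamped Langevin dynamics dX = (-∇U + γ J∇U) dt + √2 dW, where J is an antisymmetric matrix, so the added drift γ J∇U is divergence-free with respect to the Gibbs measure e^{-U} and leaves it invariant while breaking detailed balance. The quadratic potential, the particular choice of J, and the value of γ below are illustrative assumptions.

```python
import numpy as np

def overdamped_langevin(n_steps, dt, gamma, rng):
    """Euler-Maruyama discretization of
        dX_t = (-grad U(X_t) + gamma * J grad U(X_t)) dt + sqrt(2) dW_t
    for the toy potential U(x) = |x|^2 / 2 (standard 2D Gaussian target).
    Since J is antisymmetric, the drift gamma * J grad U preserves the
    Gibbs measure exp(-U): the invariant distribution is the same for
    every gamma, but gamma != 0 makes the dynamics irreversible."""
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])     # antisymmetric matrix
    x = np.zeros(2)
    traj = np.empty((n_steps, 2))
    for k in range(n_steps):
        grad_U = x                               # gradient of |x|^2 / 2
        drift = -grad_U + gamma * (J @ grad_U)   # reversible + irreversible part
        x = x + dt * drift + np.sqrt(2.0 * dt) * rng.standard_normal(2)
        traj[k] = x
    return traj

rng = np.random.default_rng(0)
rev = overdamped_langevin(200_000, 0.005, gamma=0.0, rng=rng)  # reversible
irr = overdamped_langevin(200_000, 0.005, gamma=2.0, rng=rng)  # irreversible
# Both chains target the same standard Gaussian, so ergodic averages agree:
print(rev.mean(axis=0), irr.mean(axis=0))    # both close to (0, 0)
print((rev ** 2).mean(), (irr ** 2).mean())  # both close to 1
```

The paper's result predicts that strengthening the irreversible part enlarges the large deviation rate function and reduces the asymptotic variance of ergodic averages for generic observables; in a sketch like this, that effect can be checked empirically by comparing the spread of time averages over many independent runs of the two samplers.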
Pages: 2081-2103
Page count: 23