Stochastic Variational Inference with Gradient Linearization

Cited by: 1
Authors: Ploetz, Tobias [1]; Wannenwetsch, Anne S. [1]; Roth, Stefan [1]
Affiliations: [1] Tech Univ Darmstadt, Dept Comp Sci, Darmstadt, Germany
Source: 2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018
Funding: European Research Council
Keywords: IMAGE; RESTORATION
DOI: 10.1109/CVPR.2018.00169
CLC classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Variational inference has experienced a recent surge in popularity owing to stochastic approaches, which have yielded practical tools for a wide range of model classes. A key benefit is that stochastic variational inference obviates the tedious process of deriving analytical expressions for closed-form variable updates. Instead, one simply needs to derive the gradient of the log-posterior, which is often much easier. Yet for certain model classes, the log-posterior itself is difficult to optimize using standard gradient techniques. One such example is random field models, where optimization based on gradient linearization has proven popular, since it speeds up convergence significantly and can avoid poor local optima. In this paper we propose stochastic variational inference with gradient linearization (SVIGL). It is as convenient as standard stochastic variational inference: all that is required is a local linearization of the energy gradient. Its benefit over stochastic variational inference with conventional gradient methods is a clear improvement in convergence speed, while yielding comparable or even better variational approximations in terms of KL divergence. We demonstrate the benefits of SVIGL in three applications: optical flow estimation, Poisson-Gaussian denoising, and 3D surface reconstruction.
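To make the abstract's idea concrete, below is a minimal sketch of stochastic variational inference with a linearized energy gradient on a toy 1D denoising random field. Everything in it is an illustrative assumption rather than the paper's exact algorithm: the Charbonnier energy, the linearize helper, the sample count S, and the mean-field variance rule sigma_i^2 = 1/A_ii are chosen here only to convey how reparameterized samples combine with a lagged linear solve.

import numpy as np

# Toy energy: E(x) = (lam/2)||x - y||^2 + sum_i rho((Dx)_i), with the
# Charbonnier penalty rho(z) = sqrt(z^2 + eps^2) and D a finite-difference
# operator. Its gradient admits the linearized form grad E(x) = A(x) x - b:
#   A(x) = lam*I + D^T diag(w(Dx)) D,  w(z) = rho'(z)/z = 1/sqrt(z^2 + eps^2),
#   b    = lam*y.
rng = np.random.default_rng(0)
n, lam, eps = 64, 4.0, 1e-3

x_true = np.repeat([0.0, 1.0, -0.5, 0.5], n // 4)   # piecewise-constant signal
y = x_true + 0.1 * rng.standard_normal(n)           # noisy observation
D = np.diff(np.eye(n), axis=0)                      # (n-1) x n finite differences

def linearize(x):
    """Return A(x) and b such that grad E(x) = A(x) x - b (lagged weights)."""
    w = 1.0 / np.sqrt((D @ x) ** 2 + eps ** 2)
    return lam * np.eye(n) + D.T @ (w[:, None] * D), lam * y

# Diagonal-Gaussian variational family q = N(mu, diag(sigma^2)).
mu, sigma = y.copy(), np.full(n, 0.5)
S = 8                                               # reparameterized samples per step

for _ in range(20):
    # Average the linearized system over samples x_s = mu + sigma * eps_s ~ q.
    A_bar = np.zeros((n, n))
    for _ in range(S):
        x_s = mu + sigma * rng.standard_normal(n)
        A_s, b = linearize(x_s)
        A_bar += A_s / S
    mu = np.linalg.solve(A_bar, b)                  # one lagged linear solve for the mean
    sigma = 1.0 / np.sqrt(np.diag(A_bar))           # mean-field variances for precision A_bar

print("RMSE of variational mean:", np.sqrt(np.mean((mu - x_true) ** 2)))

Replacing the inner linear solve with plain stochastic-gradient steps on the sampled energy would recover ordinary black-box SVI; the solve against the lagged system A_bar is the gradient-linearization ingredient that the abstract credits with faster convergence.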
Pages: 1566-1575
Number of pages: 10
Related Papers (50 total)
  • [1] Stochastic Variational Inference
    Hoffman, Matthew D.
    Blei, David M.
    Wang, Chong
    Paisley, John
    JOURNAL OF MACHINE LEARNING RESEARCH, 2013, 14: 1303-1347
  • [2] Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent
    Lim, Kart-Leong
    ICPRAM: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS, 2020: 33-42
  • [3] Accelerated Stochastic Variational Inference
    Hu, Pingbo
    Weng, Yang
    2019 IEEE INTL CONF ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, BIG DATA & CLOUD COMPUTING, SUSTAINABLE COMPUTING & COMMUNICATIONS, SOCIAL COMPUTING & NETWORKING (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2019), 2019: 1275-1282
  • [4] Structured Stochastic Variational Inference
    Hoffman, Matthew D.
    Blei, David M.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015: 361-369
  • [5] Stochastic gradient descent performs variational inference, converges to limit cycles for deep networks
    Chaudhari, Pratik
    Soatto, Stefano
    2018 INFORMATION THEORY AND APPLICATIONS WORKSHOP (ITA), 2018
  • [6] Gradient Regularization as Approximate Variational Inference
    Unlu, Ali
    Aitchison, Laurence
    ENTROPY, 2021, 23 (12)
  • [7] Natural conjugate gradient in variational inference
    Honkela, Antti
    Tornio, Matti
    Raiko, Tapani
    Karhunen, Juha
    NEURAL INFORMATION PROCESSING, PART II, 2008, 4985: 305-314
  • [8] Noisy Natural Gradient as Variational Inference
    Zhang, Guodong
    Sun, Shengyang
    Duvenaud, David
    Grosse, Roger
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018
  • [9] Training Variational Autoencoders with Buffered Stochastic Variational Inference
    Shu, Rui
    Bui, Hung H.
    Whang, Jay
    Ermon, Stefano
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019