Online model selection based on the variational Bayes

Cited by: 258
Authors
Sato, M. [1]
Affiliations
[1] ATR Int, Div Informat Sci, Kyoto 6190288, Japan
[2] Japan Sci & Technol Corp, CREST, Kyoto 6190288, Japan
Keywords
DOI
10.1162/089976601750265045
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The Bayesian framework provides a principled way of performing model selection. This framework estimates a probability distribution over an ensemble of models, and prediction is done by averaging over that ensemble. Accordingly, the uncertainty of the models is taken into account, and complex models with more degrees of freedom are penalized. However, the integration over model parameters is often intractable, and some approximation scheme is needed. Recently, a powerful approximation scheme called the variational Bayes (VB) method has been proposed. This approach defines a free energy for a trial probability distribution that approximates the joint posterior distribution over model parameters and hidden variables. Exact maximization of the free energy gives the true posterior distribution. The VB method uses factorized trial distributions, so the integration over model parameters can be done analytically, and an iterative expectation-maximization-like algorithm, whose convergence is guaranteed, is derived. In this article, we derive an online version of the VB algorithm and prove its convergence by showing that it is a stochastic approximation for finding the maximum of the free energy. Combined with a sequential model selection procedure, the online VB method provides a fully online learning method with a model selection mechanism. In preliminary experiments using synthetic data, the online VB method was able to adapt the model structure to dynamic environments.
Pages: 1649 - 1681
Number of pages: 33
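
Editorial note, as background for the abstract: the free energy maximized by the VB method is the standard variational lower bound. In generic notation (ours, not quoted from the article), for observed data X, hidden variables Z, and model parameters theta,

    F[q] = \int q(Z, \theta) \, \ln \frac{p(X, Z, \theta)}{q(Z, \theta)} \, dZ \, d\theta
         = \ln p(X) - \mathrm{KL}\!\left( q(Z, \theta) \,\middle\|\, p(Z, \theta \mid X) \right),

so maximizing F[q] over trial distributions q is equivalent to minimizing the KL divergence to the joint posterior, and the bound is tight exactly when q equals that posterior.

The sketch below illustrates the flavor of an online VB update on a toy model: a univariate Gaussian mixture with known variance, a Dirichlet prior on the mixing weights, and Gaussian priors on the component means. The model family, the class name OnlineVBMixture, all hyperparameter values, and the constant forgetting factor are illustrative assumptions, not the article's algorithm or code; Sato's method uses a decaying stochastic-approximation step size, for which convergence is proved, and adds a sequential model selection step on top of updates of this kind.

import numpy as np
from scipy.special import digamma


class OnlineVBMixture:
    """Online VB sketch for a univariate Gaussian mixture with known variance.

    Dirichlet prior on the mixing weights, Gaussian prior N(m0, tau2) on each
    component mean.  Expected sufficient statistics are accumulated with a
    forgetting factor, a simple stand-in for a stochastic-approximation step
    size (an assumption, not the article's schedule)."""

    def __init__(self, K=3, sigma2=1.0, alpha0=1.0, m0=0.0, tau2=10.0,
                 forget=0.995, seed=0):
        rng = np.random.default_rng(seed)
        self.K, self.sigma2 = K, sigma2
        self.alpha0, self.m0, self.tau2 = alpha0, m0, tau2
        self.forget = forget
        self.Nk = np.full(K, 1e-3)          # discounted responsibility counts
        self.Sk = np.zeros(K)               # discounted responsibility-weighted sums of x
        self.mk = rng.normal(0.0, 1.0, K)   # means of q(mu_k)
        self.vk = np.full(K, tau2)          # variances of q(mu_k)

    def _responsibilities(self, x):
        # Variational E-step for one observation:
        # log q(z = k) is proportional to E[log pi_k] + E[log N(x | mu_k, sigma2)].
        alpha = self.alpha0 + self.Nk
        e_log_pi = digamma(alpha) - digamma(alpha.sum())
        e_log_lik = (-0.5 * ((x - self.mk) ** 2 + self.vk) / self.sigma2
                     - 0.5 * np.log(2.0 * np.pi * self.sigma2))
        log_r = e_log_pi + e_log_lik
        r = np.exp(log_r - log_r.max())
        return r / r.sum()

    def update(self, x):
        # Online M-step: decay the old statistics, absorb the new point, then
        # refresh the conjugate posteriors q(mu_k) in closed form.
        r = self._responsibilities(x)
        self.Nk = self.forget * self.Nk + r
        self.Sk = self.forget * self.Sk + r * x
        self.vk = 1.0 / (1.0 / self.tau2 + self.Nk / self.sigma2)
        self.mk = self.vk * (self.m0 / self.tau2 + self.Sk / self.sigma2)


if __name__ == "__main__":
    # Stream data whose generating means shift halfway through, mimicking the
    # dynamic environments of the article's synthetic-data experiments.
    rng = np.random.default_rng(1)
    model = OnlineVBMixture(K=3)
    for t in range(4000):
        centers = (-2.0, 0.0, 3.0) if t < 2000 else (-4.0, 0.0, 5.0)
        model.update(rng.normal(rng.choice(centers), 1.0))
    weights = (model.alpha0 + model.Nk) / (model.K * model.alpha0 + model.Nk.sum())
    print("posterior means of q(mu_k):", np.round(model.mk, 2))
    print("expected mixing weights:  ", np.round(weights, 2))

Discounting the sufficient statistics is what lets the variational posterior track a drifting data distribution, the behavior probed in the article's experiments; a sequential model selection mechanism (adding or pruning components by comparing free energies) would sit on top of these updates.
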
Related papers
50 items in total
  • [31] A variational Bayes model for count data learning and classification
    Bakhtiari, Ali Shojaee
    Bouguila, Nizar
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2014, 35 : 176 - 186
  • [32] Image Denoising method based on NSCT bivariate model and Variational Bayes threshold estimation
    Wang Deyan
    Xiao Yin
    Gao Ya
    Multimedia Tools and Applications, 2019, 78 : 8927 - 8941
  • [33] Tilted Variational Bayes
    Hensman, James
    Zwiessele, Max
    Lawrence, Neil D.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33, 2014, 33 : 356 - 364
  • [34] Variational Bayes on manifolds
    Minh-Ngoc Tran
    Dang H. Nguyen
    Duy Nguyen
STATISTICS AND COMPUTING, 2021, 31 (06)
  • [36] Nonlinear filtering for spaceborne radars based on variational Bayes
    Yan W.
    Lan H.
    Wang Z.
    Jin S.
    Pan Q.
Chinese Society of Astronautics, (41)
  • [37] Distributed Variational Bayes Based on Consensus of Probability Densities
    Lin, Peng
    Hu, Chen
    Lou, Yu
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 5013 - 5018
  • [38] Variational Bayes based approach to robust subspace learning
    Okatani, Takayuki
    Deguchi, Koichiro
2007 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, VOLS 1-8, 2007: 1004+
  • [39] Distributed Fusion Target Tracking Based on Variational Bayes
    Hu Z.-T.
    Yang S.-B.
    Hu Y.-M.
    Zhou L.
    Jin Y.
    Yang L.-L.
Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2022, 50 (05): 1058 - 1065
  • [40] Linear Gaussian Regression Filter based on Variational Bayes
    Wang, Xiaoxu
    Cui, Haoran
    Pan, Quan
    Liang, Yan
    Hu, Jinwen
    Xu, Zhao
    2018 21ST INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2018, : 2072 - 2077