Self-Adversarially Learned Bayesian Sampling

Cited by: 0
Authors
Zhao, Yang [1 ]
Zhang, Jianyi [2 ]
Chen, Changyou [1 ]
Affiliations
[1] SUNY Buffalo, Buffalo, NY 14260 USA
[2] Fudan Univ, Shanghai, Peoples R China
Source
THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2019
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Scalable Bayesian sampling plays an important role in modern machine learning, especially in fast-developing unsupervised (deep) learning models. While tremendous progress has been made with scalable Bayesian samplers such as stochastic gradient MCMC (SG-MCMC) and Stein variational gradient descent (SVGD), the generated samples are typically highly correlated, and the sample-generation process is often criticized as inefficient. In this paper, we propose a novel self-adversarial learning framework that automatically learns a conditional generator to mimic the behavior of a Markov (transition) kernel. High-quality samples can then be generated efficiently by direct forward passes through the learned generator. Most importantly, the learning process adopts a self-learning paradigm, requiring no information about existing Markov kernels, e.g., knowledge of how to draw samples from them. Specifically, our framework learns to use current samples, either from the generator or from pre-provided training data, to update the generator so that the generated samples progressively approach a target distribution; hence the name self-learning. Experiments on both synthetic and real datasets verify the advantages of our framework, which outperforms related methods in both sampling efficiency and sample quality.
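The abstract's core idea — a generator trained on its own samples, refined by a Markov-kernel-like update, so that its output distribution drifts toward the target — can be illustrated with a minimal numpy sketch. This is not the authors' algorithm: the 1-D Gaussian target, the affine generator `w * z + b`, and the learning rates are all hypothetical choices; the refinement step is the standard SVGD particle update, used here as a stand-in for the transition kernel being mimicked.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D target: N(3, 1); its score (grad log-density) is analytic.
def grad_log_p(x):
    return -(x - 3.0)

def svgd_step(x, eps=0.3):
    """One standard SVGD update: attract particles along the score,
    repel them via the kernel gradient to keep spread."""
    diff = x[:, None] - x[None, :]                      # diff[i, j] = x_i - x_j
    h = np.median(np.abs(diff)) ** 2 / np.log(len(x) + 1) + 1e-8
    k = np.exp(-diff ** 2 / h)                          # RBF kernel matrix
    grad_k = 2.0 * diff / h * k                         # d k(x_j, x_i) / d x_j
    phi = (k @ grad_log_p(x) + grad_k.sum(axis=1)) / len(x)
    return x + eps * phi

# Affine "generator" G(z) = w * z + b, trained only on its own refined samples
# (the self-learning loop: no access to an external sampler).
w, b = 1.0, 0.0
lr = 0.1
for _ in range(3000):
    z = rng.standard_normal(64)
    x = w * z + b                  # samples from the current generator
    x_ref = svgd_step(x)           # refine them with one kernel step
    err = x - x_ref                # regress G(z) onto the refined samples
    w -= lr * np.mean(err * z)
    b -= lr * np.mean(err)

samples = w * rng.standard_normal(10000) + b
print(np.mean(samples), np.std(samples))  # drifts toward the target N(3, 1)
```

After training, a single forward pass of the generator (one affine map of fresh noise) produces approximately target-distributed samples, which is the efficiency argument the abstract makes against iterating a Markov kernel at sampling time.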
Pages: 5893-5900
Page count: 8