Model Fusion with Kullback-Leibler Divergence

Cited by: 0
Authors
Claici, Sebastian [1 ,2 ]
Yurochkin, Mikhail [2 ,3 ]
Ghosh, Soumya [2 ,3 ]
Solomon, Justin [1 ,2 ]
Affiliations
[1] MIT, CSAIL, 77 Massachusetts Ave, Cambridge, MA 02139 USA
[2] MIT IBM Watson AI Lab, Cambridge, MA USA
[3] IBM Res, Cambridge, MA USA
Funding
US National Science Foundation;
Keywords
FINITE;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors, and proceeds using a simple assign-and-average approach. The components of the dataset posteriors are assigned to the proposed global model components by solving a regularized variant of the assignment problem. The global components are then updated based on these assignments by their mean under a KL divergence. For exponential family variational distributions, our formulation leads to an efficient non-parametric algorithm for computing the fused model. Our algorithm is easy to describe and implement, efficient, and competitive with the state of the art on motion capture analysis, topic modeling, and federated learning of Bayesian neural networks.
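
To make the assign-and-average idea concrete, the sketch below illustrates one plausible reading of the abstract for mean-field diagonal-Gaussian posteriors. It is not the authors' implementation: it substitutes an unregularized Hungarian assignment (scipy.optimize.linear_sum_assignment) for the regularized assignment problem described above, and it assumes the KL mean of assigned components is taken as argmin_q sum_j KL(q_j || q), which for exponential families reduces to averaging expectation parameters (moment matching in the Gaussian case). The function names and the fixed number of global components are illustrative assumptions.

# Illustrative sketch only (not the paper's code): fuse mean-field
# diagonal-Gaussian posteriors by assign-and-average.
import numpy as np
from scipy.optimize import linear_sum_assignment

def kl_diag_gauss(m1, v1, m2, v2):
    # KL(N(m1, diag v1) || N(m2, diag v2)) for diagonal Gaussians.
    return 0.5 * np.sum(np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

def fuse(local_posteriors, global_init, n_iters=10):
    # local_posteriors: list of (means, vars) pairs, each array of shape (K_i, d).
    # global_init: (means, vars) for the global model, arrays of shape (K, d).
    g_mean, g_var = map(np.copy, global_init)
    K = g_mean.shape[0]
    for _ in range(n_iters):
        # 1) Assign each local component to a global component by minimizing
        #    a KL cost (Hungarian algorithm stands in for the regularized
        #    assignment problem in the abstract).
        assignments = []
        for m, v in local_posteriors:
            cost = np.array([[kl_diag_gauss(m[j], v[j], g_mean[k], g_var[k])
                              for k in range(K)] for j in range(m.shape[0])])
            rows, cols = linear_sum_assignment(cost)
            assignments.append(dict(zip(cols, rows)))  # global index -> local index
        # 2) Update each global component to the KL mean of its assigned
        #    components; for Gaussians this is moment matching (average the
        #    first and second moments of the assigned components).
        for k in range(K):
            firsts, seconds = [], []
            for (m, v), a in zip(local_posteriors, assignments):
                if k in a:
                    j = a[k]
                    firsts.append(m[j])
                    seconds.append(v[j] + m[j] ** 2)
            if firsts:
                g_mean[k] = np.mean(firsts, axis=0)
                g_var[k] = np.mean(seconds, axis=0) - g_mean[k] ** 2
    return g_mean, g_var

The sketch keeps the number of global components fixed; the non-parametric behaviour mentioned in the abstract (letting the global model grow as local components fail to match existing ones) and the regularization of the assignment are omitted for brevity.
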
Pages: 10