Model Fusion with Kullback-Leibler Divergence

Cited by: 0
Authors
Claici, Sebastian [1 ,2 ]
Yurochkin, Mikhail [2 ,3 ]
Ghosh, Soumya [2 ,3 ]
Solomon, Justin [1 ,2 ]
Affiliations
[1] MIT, CSAIL, 77 Massachusetts Ave, Cambridge, MA 02139 USA
[2] MIT IBM Watson AI Lab, Cambridge, MA USA
[3] IBM Res, Cambridge, MA USA
Funding
National Science Foundation (USA);
Keywords
FINITE;
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification
0812;
Abstract
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors, and proceeds using a simple assign-and-average approach. The components of the dataset posteriors are assigned to the proposed global model components by solving a regularized variant of the assignment problem. The global components are then updated based on these assignments by their mean under a KL divergence. For exponential family variational distributions, our formulation leads to an efficient non-parametric algorithm for computing the fused model. Our algorithm is easy to describe and implement, efficient, and competitive with the state of the art on motion capture analysis, topic modeling, and federated learning of Bayesian neural networks.
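The assign-and-average step described in the abstract can be sketched in miniature. The sketch below is illustrative only and makes simplifying assumptions not in the paper: components are 1-D Gaussians, the regularized assignment problem is replaced by a plain brute-force minimum-KL matching, and the KL mean of exponential-family components is computed by averaging their mean parameters (moment matching). All function and variable names are hypothetical.

```python
import itertools
import math

def kl_gauss(mu_q, var_q, mu_p, var_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for 1-D Gaussians."""
    return 0.5 * (math.log(var_p / var_q)
                  + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def best_assignment(local_comps, global_comps):
    """Brute-force minimum-KL matching of local to global components
    (a stand-in for the paper's regularized assignment problem)."""
    K = len(global_comps)
    best, best_cost = None, float("inf")
    for perm in itertools.permutations(range(K)):
        cost = sum(kl_gauss(*local_comps[i], *global_comps[perm[i]])
                   for i in range(len(local_comps)))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best

def kl_mean(components):
    """KL barycenter of 1-D Gaussians via moment matching:
    average the mean parameters E[x] and E[x^2], convert back to (mu, var)."""
    m1 = sum(mu for mu, _ in components) / len(components)
    m2 = sum(mu ** 2 + var for mu, var in components) / len(components)
    return m1, m2 - m1 ** 2

# Two dataset posteriors, each with two Gaussian components (mu, var).
posterior_a = [(0.0, 1.0), (10.0, 1.0)]
posterior_b = [(0.5, 1.0), (9.5, 1.0)]
global_model = [(1.0, 1.0), (9.0, 1.0)]   # initial global components

# Assign each dataset's components to global components, then average.
matched = {k: [] for k in range(len(global_model))}
for post in (posterior_a, posterior_b):
    perm = best_assignment(post, global_model)
    for i, k in enumerate(perm):
        matched[k].append(post[i])

global_model = [kl_mean(matched[k]) for k in range(len(global_model))]
print(global_model)   # → [(0.25, 1.0625), (9.75, 1.0625)]
```

The two global components end up at the averages of the matched local means (0.25 and 9.75), with variance inflated slightly to account for the spread between the matched components.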
Pages: 10