Investigation of Alternative Measures for Mutual Information

Cited by: 2
Authors
Kuskonmaz, Bulut [1 ]
Gundersen, Jaron S. [1 ]
Wisniewski, Rafal [1 ]
Affiliations
[1] Aalborg Univ, Dept Elect Syst, Aalborg, Denmark
DOI: 10.1016/j.ifacol.2022.09.016
CLC number: TP [Automation technology; computer technology]
Discipline code: 0812
Abstract
Mutual information I(X; Y) is a useful quantity in information theory that measures how much information the random variable Y holds about the random variable X. One way to define the mutual information is to compare the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, there is almost no leakage of X from Y, since the two variables are then close to being independent. In the discrete setting, the mutual information has the appealing interpretation of how many bits Y reveals about X. In the continuous case, however, this interpretation no longer applies, which motivates trying different metrics or divergences in place of the KL divergence. In this paper, we evaluate different metrics and divergences to form alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance. Copyright (C) 2022 The Authors.
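The KL-based definition in the abstract is easy to make concrete in the discrete setting, where the "bits revealed" interpretation holds. The sketch below (not from the paper; the function name and example distributions are illustrative) computes I(X; Y) as the KL divergence between a joint probability table and the product of its marginals:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X; Y) in bits for a discrete joint distribution given as a 2-D table,
    computed as D_KL(P_XY || P_X x P_Y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = p_xy > 0                         # use the 0 * log 0 = 0 convention
    # log base 2 gives the result in bits
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])).sum())

# X = Y uniform on {0, 1}: observing Y reveals exactly one bit about X.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# Independent uniform variables: no leakage, I(X; Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

In the continuous case no such finite table exists, which is exactly the gap the paper addresses by substituting other metrics and divergences for the KL divergence.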
Pages: 154 / 159
Number of pages: 6