Origo: causal inference by compression

Cited by: 0
Authors
Kailash Budhathoki
Jilles Vreeken
Affiliation
[1] Max Planck Institute for Informatics and Saarland University
Source
Knowledge and Information Systems | 2018 / Vol. 56
Keywords
Causal inference; Kolmogorov complexity; MDL; Decision trees; Binary data
DOI
not available
Abstract
Causal inference from observational data is one of the most fundamental problems in science. In general, the task is to tell whether it is more likely that X caused Y, or vice versa, given only data over their joint distribution. In this paper we propose a general inference framework based on Kolmogorov complexity, as well as a practical and computable instantiation based on the Minimum Description Length (MDL) principle. Simply put, we propose causal inference by compression. That is, we infer that X is a likely cause of Y if we can compress the data better by first encoding X and then encoding Y given X, than in the other direction. To show that this works in practice, we propose Origo, an efficient method for inferring the causal direction from binary data. Origo employs the lossless Pack compressor and searches for the set of decision trees that encodes the data most succinctly. Importantly, it works directly on the data and requires assumptions about neither the distributions nor the type of causal relation. To evaluate Origo in practice, we provide extensive experiments on synthetic, benchmark, and real-world data, including three case studies. Altogether, the experiments show that Origo reliably infers the correct causal direction across a wide range of settings.
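The decision rule behind the framework can be sketched in a few lines: infer X → Y when the total codelength of encoding X first and then Y given X is smaller than the reverse. The sketch below is only an illustration of this rule, not the paper's method. It stands in a general-purpose compressor (zlib) for the Pack decision-tree compressor used by Origo, and approximates the conditional codelength C(Y | X) by the crude difference C(X·Y) − C(X); the function names are hypothetical.

```python
import zlib

def C(b: bytes) -> int:
    """Approximate a codelength by the zlib-compressed size (level 9)."""
    return len(zlib.compress(b, 9))

def cond(b: bytes, given: bytes) -> int:
    """Crude conditional codelength C(b | given) ~= C(given + b) - C(given)."""
    return C(given + b) - C(given)

def infer_direction(x: bytes, y: bytes) -> str:
    """Compare the two directed total codelengths and pick the shorter one."""
    cost_xy = C(x) + cond(y, x)  # encode X, then Y given X
    cost_yx = C(y) + cond(x, y)  # encode Y, then X given Y
    if cost_xy < cost_yx:
        return "X -> Y"
    if cost_yx < cost_xy:
        return "Y -> X"
    return "undecided"
```

With an off-the-shelf compressor the two totals are often close to symmetric; the paper's point is that a refined MDL encoding (here, sets of decision trees over binary data) makes the asymmetry between the two directions informative.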
Pages: 285–307
Page count: 22