Learning causal graphs via nonlinear sufficient dimension reduction

Cited by: 0
Authors
Solea, Eftychia [1 ]
Li, Bing [2 ]
Kim, Kyongwon [3 ]
Affiliations
[1] Queen Mary Univ London, Sch Math Sci, London E1 4NS, England
[2] Penn State Univ, Dept Stat, 326 Thomas Bldg, University Pk, PA 16802 USA
[3] Yonsei Univ, Dept Appl Stat, Dept Stat & Data Sci, 50 Yonsei ro, Seoul 03722, South Korea
Funding
National Research Foundation, Singapore
Keywords
causality; conditional independence; directed graphs; PC algorithm; sufficient dimension reduction; sliced inverse regression; Markov equivalence classes; support vector machines; kernel; networks; consistency; models; selection
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We introduce a new nonparametric methodology for estimating a directed acyclic graph (DAG) from observational data. Our method is nonparametric in nature: it does not impose any specific form on the joint distribution of the underlying DAG. Instead, it relies on a linear operator on reproducing kernel Hilbert spaces to evaluate conditional independence. However, a fully nonparametric approach would involve conditioning on a large number of random variables, subjecting it to the curse of dimensionality. To overcome this problem, we apply nonlinear sufficient dimension reduction to reduce the number of variables before evaluating conditional independence. We develop an estimator of the DAG based on a linear operator that characterizes conditional independence, and we establish the consistency and convergence rates of this estimator, as well as the uniform consistency of the estimated Markov equivalence class. We introduce a modified PC algorithm to implement the estimating procedure efficiently, so that its complexity depends on the sparsity of the true underlying DAG. We demonstrate the effectiveness of our methodology through simulations and a real data analysis.
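To make the PC-algorithm component of the abstract concrete, here is a minimal, illustrative sketch of the skeleton-recovery (edge-deletion) phase of a PC-style procedure. It is not the paper's method: the paper's conditional-independence criterion is an RKHS linear operator applied after nonlinear sufficient dimension reduction, whereas this sketch substitutes a simple Gaussian partial-correlation (Fisher-z) test as a placeholder; all function names are hypothetical. It does show the key complexity property mentioned in the abstract: conditioning sets are drawn only from the current neighbours of each edge, so the cost scales with the sparseness of the graph.

```python
import itertools
import numpy as np

def fisher_z_indep(data, i, j, cond, z_cut=3.0):
    """Placeholder CI test: declare X_i independent of X_j given X_cond
    using a Fisher-z test on the Gaussian partial correlation.
    (The paper instead uses an RKHS operator after dimension reduction.)"""
    idx = [i, j] + list(cond)
    # Partial correlation of the first two variables from the precision matrix.
    prec = np.linalg.pinv(np.cov(data[:, idx], rowvar=False))
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])
    r = np.clip(r, -0.9999, 0.9999)
    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(data.shape[0] - len(cond) - 3)
    return abs(z) < z_cut

def pc_skeleton(data, ci_test=fisher_z_indep):
    """Edge-deletion phase of a PC-style algorithm: start from the complete
    undirected graph and delete an edge i-j as soon as some conditioning set,
    drawn from the CURRENT neighbours of i or j, renders them independent."""
    p = data.shape[1]
    adj = {i: set(range(p)) - {i} for i in range(p)}
    depth = 0  # size of the conditioning sets being searched
    while any(len(adj[i] - {j}) >= depth for i in adj for j in adj[i]):
        for i in range(p):
            for j in sorted(adj[i]):
                if j < i:
                    continue  # each edge handled once per depth
                # Sparse graphs => small neighbour sets => few CI tests.
                pool = itertools.chain(
                    itertools.combinations(sorted(adj[i] - {j}), depth),
                    itertools.combinations(sorted(adj[j] - {i}), depth))
                for cond in pool:
                    if ci_test(data, i, j, cond):
                        adj[i].discard(j)
                        adj[j].discard(i)
                        break
        depth += 1
    return adj
```

On a linear Gaussian chain X0 -> X1 -> X2, the procedure keeps the edges 0-1 and 1-2 but deletes 0-2, since X0 and X2 become independent once X1 is in the conditioning set.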
Pages: 1-46