Missing Values in Multiple Joint Inference of Gaussian Graphical Models

Cited by: 0
Authors
Tozzo, Veronica [1,2,3]
Garbarino, Davide
Barla, Annalisa [4]
Affiliations
[1] Massachusetts Gen Hosp, Ctr Syst Biol, Boston, MA 02114 USA
[2] Massachusetts Gen Hosp, Dept Pathol, Boston, MA 02114 USA
[3] Harvard Med Sch, Dept Syst Biol, Boston, MA 02115 USA
[4] Univ Genoa, Dept Informat Bioengn Robot & Syst Engn, Genoa, Italy
Source
International Conference on Probabilistic Graphical Models, Vol. 138, 2020
Keywords
Missing data; Multiple joint network inference; Multi-class; Time-series; Inverse covariance estimation
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Subject classification code
081104; 0812; 0835; 1405
Abstract
Real-world phenomena are often not fully measured or completely observable, raising the so-called missing data problem. As a consequence, the need to develop ad-hoc techniques that cope with this issue arises in many inference contexts. In this paper, we focus on the inference of Gaussian Graphical Models (GGMs) from multiple input datasets having complex relationships (e.g. multi-class or temporal). We propose a method that generalises state-of-the-art approaches to the inference of both multi-class and temporal GGMs while naturally dealing with two types of missing data: partial and latent. Synthetic experiments show that our method outperforms the state of the art. In particular, we compare results with single network inference methods that suitably deal with missing data, and with multiple joint network inference methods coupled with standard pre-processing techniques (e.g. imputation). When dealing with fully observed datasets, our method analytically reduces to state-of-the-art approaches and provides a good alternative, as our implementation reaches convergence in shorter or comparable time. Finally, we show that properly addressing the missing data problem in a multi-class real-world example allows us to discover interesting varying patterns.
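For orientation, the sketch below illustrates the kind of baseline pipeline the abstract compares against (naive imputation followed by independent per-class graphical lasso estimation), not the authors' joint method; the simulated data, parameter values, and names are illustrative assumptions.

# Minimal sketch, assuming a two-class setting with entries missing completely at random:
# each class is imputed with column means and fitted separately with the graphical lasso.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

def make_class_data(n_samples=200, n_vars=10, missing_rate=0.1):
    """Simulate one class: Gaussian samples with some entries replaced by NaN."""
    X = rng.standard_normal((n_samples, n_vars))
    X[rng.random(X.shape) < missing_rate] = np.nan
    return X

# Two related classes; a joint method would couple their estimates,
# whereas this baseline treats each class independently.
classes = {"class_A": make_class_data(), "class_B": make_class_data()}

precisions = {}
for name, X in classes.items():
    X_imputed = SimpleImputer(strategy="mean").fit_transform(X)  # naive pre-processing step
    model = GraphicalLasso(alpha=0.1).fit(X_imputed)             # sparse precision (inverse covariance) estimate
    precisions[name] = model.precision_

for name, K in precisions.items():
    n_edges = (np.abs(K[np.triu_indices_from(K, k=1)]) > 1e-6).sum()
    print(f"{name}: {n_edges} edges in the estimated conditional-independence graph")

Because imputation ignores the uncertainty of the missing entries and each class is fitted in isolation, such a baseline can miss patterns shared across classes, which is the gap the proposed joint inference method targets.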
Pages: 497-508
Number of pages: 12