Fast and Separable Estimation in High-Dimensional Tensor Gaussian Graphical Models

Cited: 16
Authors
Min, Keqian [1]
Mai, Qing [1]
Zhang, Xin [1]
Affiliations
[1] Florida State Univ, Dept Stat, 214 OSB, 117 N Woodward Ave, POB 3064330, Tallahassee, FL 32306 USA
Funding
National Science Foundation (USA)
Keywords
Graphical models; Kronecker covariance; Sparse precision matrix; Tensor; INVERSE COVARIANCE ESTIMATION; MATRIX; SELECTION; LASSO; CONVERGENCE; REGRESSION;
DOI
10.1080/10618600.2021.1938086
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
In tensor data analysis, the Kronecker covariance structure plays a vital role in unsupervised learning and regression. Under the Kronecker covariance model assumption, the covariance of an M-way tensor is parameterized as the Kronecker product of M individual covariance matrices. With normally distributed tensors, the key to high-dimensional tensor graphical models becomes the sparse estimation of the M inverse covariance matrices. Because the tensor normal likelihood cannot be maximized analytically, existing approaches often require cyclic updates of the M sparse matrices. For high-dimensional tensor graphical models, each update step solves a regularized inverse covariance estimation problem that is computationally nontrivial. This computational challenge motivates our study of whether a noncyclic approach can match the cyclic algorithms in theory and practice. To handle potentially very high-dimensional and high-order tensors, we propose a separable and parallel estimation scheme. We show that the new estimator achieves the same minimax optimal convergence rate as the cyclic estimation approaches. Numerically, the new estimator is much faster and often more accurate than the cyclic approach. Moreover, the separable estimation scheme offers flexibility in modeling: user-specified or specially structured covariances can easily be incorporated on any mode of the tensor. We demonstrate the efficiency of the proposed method through both simulations and a neuroimaging application. Supplementary materials for this article are available online.
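The Kronecker covariance model described in the abstract can be illustrated in the simplest matrix (M = 2) case. The numpy sketch below (an illustration only, not the authors' implementation) checks numerically that a matrix-normal draw X = L1 Z L2' has Cov(vec(X)) = Sigma2 ⊗ Sigma1, and then forms the mode-1 Gram matrix, which recovers Sigma1 up to scale — the basic fact that makes a separable, noncyclic mode-by-mode estimate possible. All dimensions and variable names here are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, n = 4, 3, 200_000  # small demo dimensions, many samples

# Two positive-definite mode-wise covariance factors (arbitrary construction)
A = rng.normal(size=(p, p))
B = rng.normal(size=(q, q))
Sigma1 = A @ A.T + p * np.eye(p)
Sigma2 = B @ B.T + q * np.eye(q)
L1 = np.linalg.cholesky(Sigma1)
L2 = np.linalg.cholesky(Sigma2)

# Matrix-normal draws: X_i = L1 Z_i L2', with Z_i having iid N(0,1) entries.
# By vec(AXB) = (B' ⊗ A) vec(X), Cov(vec(X_i)) = Sigma2 ⊗ Sigma1
# under the column-major vec convention.
Z = rng.normal(size=(n, p, q))
X = L1 @ Z @ L2.T  # matmul broadcasts over the sample axis

# Column-major vectorization of each sample, then the empirical covariance
V = X.transpose(0, 2, 1).reshape(n, p * q)
emp = V.T @ V / n
kron = np.kron(Sigma2, Sigma1)
rel_err = np.linalg.norm(emp - kron) / np.linalg.norm(kron)

# Mode-1 Gram matrix: E[X X'] = tr(Sigma2) * Sigma1, so averaging the
# per-sample Gram matrices estimates Sigma1 up to a known scale -- each
# mode can be handled separately, without cyclic updates.
S1 = np.einsum("nij,nkj->ik", X, X) / n
S1_scaled = S1 / np.trace(Sigma2)
rel_err1 = np.linalg.norm(S1_scaled - Sigma1) / np.linalg.norm(Sigma1)
print(rel_err, rel_err1)  # both shrink at the usual 1/sqrt(n) rate
```

In the paper's high-dimensional setting, a sparse precision matrix for each mode would then be obtained by plugging such a mode-wise covariance estimate into a regularized inverse covariance routine, one mode at a time and in parallel.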
Pages: 294-300
Number of pages: 7