Reduced-Rank Tensor-on-Tensor Regression and Tensor-Variate Analysis of Variance

Cited by: 8
Authors
Llosa-Vite, Carlos [1 ]
Maitra, Ranjan [1 ]
Affiliations
[1] Iowa State Univ, Dept Stat, Ames, IA 50011 USA
Funding
National Institute of Food and Agriculture (USA);
Keywords
CP decomposition; HOLQ; HOSVD; Kronecker separable models; LFW dataset; multilinear statistics; multiway regression; random tensors; suicide ideation; tensor train format; tensor ring format; Tucker format; EMOTION; CORTEX; FMRI; MODELS; VISUALIZATION; DECOMPOSITION; ALGORITHM; SELECTION; SOFTWARE; PRODUCT
DOI
10.1109/TPAMI.2022.3164836
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Fitting regression models with many multivariate responses and covariates can be challenging, but such responses and covariates sometimes have tensor-variate structure. We extend the classical multivariate regression model to exploit this structure in two ways: first, we impose four types of low-rank tensor formats on the regression coefficients; second, we model the errors with a tensor-variate normal distribution that imposes a Kronecker-separable format on the covariance matrix. We obtain maximum likelihood estimators via block-relaxation algorithms and derive their computational complexity and asymptotic distributions. Our regression framework enables us to formulate a tensor-variate analysis of variance (TANOVA) methodology. Applied in a one-way TANOVA layout, this methodology identifies cerebral regions significantly associated with the interaction between suicide attempters or non-attempter ideators and positive-, negative-, or death-connoting words in a functional Magnetic Resonance Imaging study. Another application uses three-way TANOVA on the Labeled Faces in the Wild image dataset to distinguish facial characteristics related to ethnic origin, age group, and gender. An R package, totr, implements the methodology.
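To make the model in the abstract concrete, the following is a minimal numpy sketch of its two ingredients in the simplest (matrix-variate) special case: a low-rank coefficient array and errors with Kronecker-separable covariance. It is not the authors' totr package or their block-relaxation estimators; the dimensions, variable names (p1, p2, Sigma1, Sigma2, B), and the scalar-covariate regression at the end are hypothetical choices made only for illustration.

```python
import numpy as np

# Minimal illustrative sketch (NOT the authors' totr package or their algorithms).
# All names and dimensions below (p1, p2, Sigma1, Sigma2, B) are hypothetical.
rng = np.random.default_rng(0)
p1, p2 = 4, 3  # hypothetical dimensions of a matrix-variate response

# Errors with Kronecker-separable covariance: Cov(vec(E)) = Sigma2 (x) Sigma1,
# sampled as E = L1 Z L2' with Z having i.i.d. standard normal entries.
A1 = rng.standard_normal((p1, p1))
A2 = rng.standard_normal((p2, p2))
Sigma1 = A1 @ A1.T + p1 * np.eye(p1)  # row (mode-1) covariance factor
Sigma2 = A2 @ A2.T + p2 * np.eye(p2)  # column (mode-2) covariance factor
L1, L2 = np.linalg.cholesky(Sigma1), np.linalg.cholesky(Sigma2)
E = L1 @ rng.standard_normal((p1, p2)) @ L2.T

# Low-rank (rank-1 CP) coefficient array: B = b1 outer b2 has p1 + p2
# free parameters instead of p1 * p2 for an unstructured B.
b1, b2 = rng.standard_normal(p1), rng.standard_normal(p2)
B = np.outer(b1, b2)

# Scalar-covariate special case of the regression model: Y = x * B + E.
x = rng.standard_normal()
Y = x * B + E
print(Y.shape)  # (4, 3)
```

In the full tensor-on-tensor setting described in the abstract, both the coefficient array (in CP, Tucker, tensor-train, or tensor-ring format) and the error covariance carry one factor per tensor mode, and those factors are estimated jointly by maximum likelihood via block relaxation.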
Pages: 2282-2296
Number of pages: 15