Variable selection in multivariate linear models with high-dimensional covariance matrix estimation

Cited by: 8
Authors
Perrot-Dockes, Marie [1 ]
Levy-Leduc, Celine [1 ]
Sansonnet, Laure [1 ]
Chiquet, Julien [1 ]
Affiliations
[1] Univ Paris Saclay, UMR MIA Paris, AgroParisTech, INRA, F-75005 Paris, France
Keywords
High-dimensional covariance matrix estimation; Lasso; Multivariate linear model; Variable selection; Maximum likelihood; Regression
DOI
10.1016/j.jmva.2018.02.006
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
In this paper, we propose a novel variable selection approach in the framework of multivariate linear models that takes into account the dependence that may exist between the responses. It consists in first estimating the covariance matrix Σ of the responses and then plugging this estimator into a Lasso criterion, in order to obtain a sparse estimator of the coefficient matrix. The properties of our approach are investigated from both a theoretical and a numerical point of view. More precisely, we give general conditions that the estimators of the covariance matrix and of its inverse have to satisfy in order to recover the positions of the null and non-null entries of the coefficient matrix when the size of Σ is not fixed and can tend to infinity. We prove that these conditions are satisfied in the particular case of some Toeplitz matrices. Our approach is implemented in the R package MultiVarSel, available from the Comprehensive R Archive Network (CRAN), and is very attractive since it has a low computational load. We also assess the performance of our methodology using synthetic data and compare it with alternative approaches. Our numerical experiments show that including the estimation of the covariance matrix in the Lasso criterion dramatically improves variable selection performance in many cases. (C) 2018 Elsevier Inc. All rights reserved.
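The sketch below (Python with numpy and scikit-learn, not the MultiVarSel R implementation) illustrates the idea described in the abstract under simplifying assumptions: the residual covariance Σ is estimated from an ordinary least-squares fit, the responses are whitened with the inverse square root of that estimate, and a standard Lasso is applied to the vectorized, whitened model. The dimensions, the AR(1)-type Toeplitz covariance, and the regularization level are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative sketch of the plug-in idea for the multivariate linear model
# Y = X B + E, where the rows of E share an unknown covariance Sigma.
rng = np.random.default_rng(0)
n, p, q = 100, 10, 5  # observations, predictors, responses (arbitrary demo sizes)

# Simulated data with an AR(1)-type Toeplitz dependence between responses.
X = rng.standard_normal((n, p))
B_true = np.zeros((p, q))
B_true[:3, :2] = 1.0
rho = 0.7
Sigma = rho ** np.abs(np.subtract.outer(np.arange(q), np.arange(q)))
E = rng.multivariate_normal(np.zeros(q), Sigma, size=n)
Y = X @ B_true + E

# Step 1: estimate Sigma from the residuals of an ordinary least-squares fit.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
R = Y - X @ B_ols
Sigma_hat = (R.T @ R) / n

# Step 2: whiten the responses with Sigma_hat^{-1/2} so the transformed
# errors are approximately uncorrelated across the response columns.
evals, evecs = np.linalg.eigh(Sigma_hat)
Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
Y_w = Y @ Sigma_inv_sqrt

# Step 3: vectorize the whitened model, vec(Y_w) = (Sigma_hat^{-1/2} kron X) vec(B),
# and run a standard Lasso on vec(B) to recover the sparse coefficient matrix.
X_vec = np.kron(Sigma_inv_sqrt.T, X)    # shape (n*q, p*q)
y_vec = Y_w.reshape(-1, order="F")      # stack the columns of Y_w
lasso = Lasso(alpha=0.1, fit_intercept=False).fit(X_vec, y_vec)
B_hat = lasso.coef_.reshape((p, q), order="F")
print(np.round(B_hat, 2))
```

Whitening before vectorizing is what lets an ordinary univariate Lasso be reused: after the transformation, the stacked noise vector has (approximately) identity covariance, so the positions of the non-null entries of B can be read off the Lasso estimate directly.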
Pages: 78-97
Number of pages: 20