A Constrained l1 Minimization Approach to Sparse Precision Matrix Estimation

Cited: 631
Authors:
Cai, Tony [1]
Liu, Weidong [1]
Luo, Xi [1]
Affiliations:
[1] Univ Penn, Wharton Sch, Dept Stat, Philadelphia, PA 19104 USA
Funding:
National Science Foundation (USA)
Keywords:
Covariance matrix; Frobenius norm; Gaussian graphical model; Precision matrix; Rate of convergence; Spectral norm; VARIABLE SELECTION; COVARIANCE; CONVERGENCE; LIKELIHOOD; RECOVERY; RATES; MODEL;
DOI:
10.1198/jasa.2011.tm10155
Chinese Library Classification:
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes:
020208; 070103; 0714
Abstract:
This article proposes a constrained ℓ1 minimization method for estimating a sparse inverse covariance matrix based on a sample of n iid p-variate random variables. The resulting estimator is shown to have a number of desirable properties. In particular, the rate of convergence between the estimator and the true s-sparse precision matrix under the spectral norm is s√(log p / n) when the population distribution has either exponential-type tails or polynomial-type tails. We present convergence rates under the elementwise ℓ∞ norm and Frobenius norm. In addition, we consider graphical model selection. The procedure is easily implemented by linear programming. Numerical performance of the estimator is investigated using both simulated and real data. In particular, the procedure is applied to analyze a breast cancer dataset and is found to perform favorably compared with existing methods.
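The abstract notes that the procedure is "easily implemented by linear programming": each column of the precision-matrix estimate solves min ‖b‖₁ subject to ‖Σ̂b − e_j‖∞ ≤ λ, which becomes an LP after splitting b into positive and negative parts. A minimal sketch of that column-by-column LP, assuming NumPy/SciPy; the function names and the final symmetrization detail are illustrative, not taken verbatim from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def clime_column(S, j, lam):
    """Solve min ||b||_1  s.t.  ||S b - e_j||_inf <= lam  as an LP.

    Split b = u - v with u, v >= 0, so the objective is sum(u) + sum(v).
    """
    p = S.shape[0]
    e = np.zeros(p)
    e[j] = 1.0
    c = np.ones(2 * p)  # objective: sum(u) + sum(v) = ||b||_1
    # Two-sided constraint |S(u - v) - e| <= lam, written as A_ub x <= b_ub:
    #   S(u - v) <= lam + e   and   -S(u - v) <= lam - e
    A_ub = np.vstack([np.hstack([S, -S]), np.hstack([-S, S])])
    b_ub = np.concatenate([lam + e, lam - e])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * p), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

def clime(X, lam):
    """Estimate a sparse precision matrix from an n-by-p data matrix X."""
    S = np.cov(X, rowvar=False, bias=True)  # sample covariance (MLE scaling)
    Omega = np.column_stack([clime_column(S, j, lam)
                             for j in range(S.shape[0])])
    # Symmetrize by keeping the smaller-magnitude of each (i,j)/(j,i) pair.
    mask = np.abs(Omega) <= np.abs(Omega.T)
    return np.where(mask, Omega, Omega.T)
```

For instance, with S equal to the identity and a small λ, the column solution shrinks the diagonal entry toward 1 − λ and leaves the off-diagonal entries at zero.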
Pages: 594-607
Page count: 14
Related Articles (50 total)
  • [21] On estimation of the diagonal elements of a sparse precision matrix
    Balmand, Samuel
    ELECTRONIC JOURNAL OF STATISTICS, 2016, 10 (01): 1551 - 1579
  • [22] Dynamic Filtering of Time-Varying Sparse Signals via l1 Minimization
    Charles, Adam S.
    Balavoine, Aurele
    Rozell, Christopher J.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2016, 64 (21) : 5644 - 5656
  • [23] Sparse precision matrix estimation via lasso penalized D-trace loss
    Zhang, Teng
    Zou, Hui
    BIOMETRIKA, 2014, 101 (01) : 103 - 120
  • [24] Sparse recovery by the iteratively reweighted l1 algorithm for elastic l2-lq minimization
    Zhang, Yong
    Ye, WanZhou
    OPTIMIZATION, 2017, 66 (10) : 1677 - 1687
  • [25] A constrained l1 minimization approach for estimating multiple sparse Gaussian or nonparanormal graphical models
    Wang, Beilun
    Singh, Ritambhara
    Qi, Yanjun
    MACHINE LEARNING, 2017, 106 (9-10) : 1381 - 1417
  • [26] LOCALLY SPARSE RECONSTRUCTION USING THE l1,∞-NORM
    Heins, Pia
    Moeller, Michael
    Burger, Martin
    INVERSE PROBLEMS AND IMAGING, 2015, 9 (04) : 1093 - 1137
  • [27] A sufficient condition for restoring sparse vectors from l1-l2 minimization with cumulative coherence
    Xie, Youwei
    Zhang, Meijiao
    Xie, Shaohua
    ELECTRONICS LETTERS, 2023, 59 (09)
  • [28] A DC Programming Approach for Sparse Estimation of a Covariance Matrix
    Duy Nhat Phan
    Hoai An Le Thi
    Tao Pham Dinh
    MODELLING, COMPUTATION AND OPTIMIZATION IN INFORMATION SYSTEMS AND MANAGEMENT SCIENCES - MCO 2015, PT 1, 2015, 359 : 131 - 142
  • [29] Estimation of high-dimensional vector autoregression via sparse precision matrix
    Poignard, Benjamin
    Asai, Manabu
    ECONOMETRICS JOURNAL, 2023, 26 (02) : 307 - 326
  • [30] Beyond L1: Faster and Better Sparse Models with skglm
    Bertrand, Quentin
    Klopfenstein, Quentin
    Bannier, Pierre-Antoine
    Gidel, Gauthier
    Massias, Mathurin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,