A Constrained l1 Minimization Approach to Sparse Precision Matrix Estimation

Cited by: 631
Authors
Cai, Tony [1]
Liu, Weidong [1]
Luo, Xi [1]
Affiliations
[1] Univ Penn, Wharton Sch, Dept Stat, Philadelphia, PA 19104 USA
Funding
National Science Foundation (USA);
Keywords
Covariance matrix; Frobenius norm; Gaussian graphical model; Precision matrix; Rate of convergence; Spectral norm; VARIABLE SELECTION; COVARIANCE; CONVERGENCE; LIKELIHOOD; RECOVERY; RATES; MODEL;
DOI
10.1198/jasa.2011.tm10155
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Code
020208; 070103; 0714;
Abstract
This article proposes a constrained ℓ1 minimization method for estimating a sparse inverse covariance matrix based on a sample of n iid p-variate random variables. The resulting estimator is shown to have a number of desirable properties. In particular, the rate of convergence between the estimator and the true s-sparse precision matrix under the spectral norm is s√(log p / n) when the population distribution has either exponential-type tails or polynomial-type tails. We present convergence rates under the elementwise ℓ∞ norm and the Frobenius norm. In addition, we consider graphical model selection. The procedure is easily implemented by linear programming. Numerical performance of the estimator is investigated using both simulated and real data. In particular, the procedure is applied to analyze a breast cancer dataset and is found to perform favorably compared with existing methods.
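The abstract states that the estimator is defined through a constrained ℓ1 minimization and can be implemented by linear programming. The sketch below illustrates one way such a column-by-column linear program could be set up in Python; the function name clime_sketch, the tuning parameter lam, and the min-magnitude symmetrization step are illustrative assumptions based on the abstract's description, not the authors' implementation.

# Minimal sketch (assumed details, not the authors' code): for each column j,
# solve  min ||beta||_1  subject to  ||S beta - e_j||_inf <= lam,
# written as a linear program with beta = u - v, u, v >= 0.
import numpy as np
from scipy.optimize import linprog

def clime_sketch(X, lam=0.1):
    """Estimate a sparse precision matrix from an n x p data matrix X."""
    n, p = X.shape
    S = np.cov(X, rowvar=False)          # sample covariance matrix
    B = np.zeros((p, p))
    for j in range(p):
        e = np.zeros(p)
        e[j] = 1.0
        # Variables x = [u; v]; objective sum(u + v) equals ||beta||_1.
        c = np.ones(2 * p)
        # Stack  S beta - e_j <= lam  and  -(S beta - e_j) <= lam.
        A_ub = np.block([[S, -S], [-S, S]])
        b_ub = np.concatenate([lam + e, lam - e])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
        if not res.success:
            raise RuntimeError(f"LP for column {j} did not converge")
        B[:, j] = res.x[:p] - res.x[p:]
    # Symmetrize by keeping, for each (i, j), the entry of smaller magnitude.
    return np.where(np.abs(B) <= np.abs(B.T), B, B.T)

# Illustrative usage on simulated data:
# X = np.random.randn(200, 20)
# Omega_hat = clime_sketch(X, lam=0.2)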
Pages: 594 - 607
Number of pages: 14
Related Papers
50 records in total
  • [31] A perturbation analysis of block-sparse compressed sensing via mixed l2/l1 minimization
    Zhang, Jing
    Wang, Jianjun
    Wang, Wendong
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2016, 14 (04)
  • [32] On the Theoretical Guarantees for Parameter Estimation of Gaussian Random Field Models: A Sparse Precision Matrix Approach
    Tajbakhsh, Sam Davanloo
    Aybat, Necdet Serhat
    Del Castillo, Enrique
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [33] SOLUTION UNIQUENESS OF CONVEX PIECEWISE AFFINE FUNCTIONS BASED OPTIMIZATION WITH APPLICATIONS TO CONSTRAINED l1 MINIMIZATION
    Mousavi, Seyedahmad
    Shen, Jinglai
    ESAIM-CONTROL OPTIMISATION AND CALCULUS OF VARIATIONS, 2019, 25
  • [34] A Max-Norm Constrained Minimization Approach to 1-Bit Matrix Completion
    Cai, Tony
    Zhou, Wen-Xin
    JOURNAL OF MACHINE LEARNING RESEARCH, 2013, 14 : 3619 - 3647
  • [35] A constrained optimization reformulation and a feasible descent direction method for L1/2 regularization
    Li, Dong-Hui
    Wu, Lei
    Sun, Zhe
    Zhang, Xiong-ji
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2014, 59 (1-2) : 263 - 284
  • [36] On first-order algorithms for l1/nuclear norm minimization
    Nesterov, Yurii
    Nemirovski, Arkadi
    ACTA NUMERICA, 2013, 22 : 509 - 575
  • [37] A fast ADMM algorithm for sparse precision matrix estimation using lasso penalized D-trace loss
    Zhu, Mingmin
    Jiang, Jiewei
    Gao, Weifeng
    EGYPTIAN INFORMATICS JOURNAL, 2024, 25
  • [38] An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled lasso
    Lee, Seunghwan
    Kim, Sang Cheol
    Yu, Donghyeon
    COMPUTATIONAL STATISTICS, 2023, 38 (01) : 217 - 242
  • [39] An improved modified cholesky decomposition approach for precision matrix estimation
    Kang, Xiaoning
    Deng, Xinwei
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2020, 90 (03) : 443 - 464
  • [40] L1 - βLq Minimization for Signal and Image Recovery
    Huo, Limei
    Chen, Wengu
    Ge, Huanmin
    Ng, Michael K.
    SIAM JOURNAL ON IMAGING SCIENCES, 2023, 16 (04) : 1886 - 1928