DC algorithm for estimation of sparse Gaussian graphical models

Cited by: 0
Authors
Shiratori, Tomokaze [1 ]
Takano, Yuichi [2 ]
Affiliations
[1] Univ Tsukuba, Grad Sch Sci & Technol, Tsukuba, Ibaraki, Japan
[2] Univ Tsukuba, Inst Syst & Informat Engn, Tsukuba, Ibaraki, Japan
Source
PLOS ONE | 2024, Vol. 19, Issue 12
Keywords
VARIABLE SELECTION; MATRIX ESTIMATION; ADAPTIVE LASSO; COVARIANCE; REGRESSION; REGULARIZATION; SHRINKAGE;
DOI
10.1371/journal.pone.0315740
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Code
07; 0710; 09;
Abstract
Sparse estimation of a Gaussian graphical model (GGM) is an important technique for making relationships between observed variables more interpretable. Various methods have been proposed for sparse GGM estimation, including the graphical lasso, which uses an ℓ1-norm regularization term, and other methods that use nonconvex regularization terms. Most of these methods approximate the ℓ0 (pseudo) norm by more tractable functions; however, to obtain more accurate solutions, it is preferable to use the ℓ0 norm directly to count the number of nonzero elements. To this end, we focus on sparse estimation of GGMs with a cardinality constraint based on the ℓ0 norm. Specifically, we convert the cardinality constraint into an equivalent constraint based on the largest-K norm, and reformulate the resulting constrained optimization problem into an unconstrained penalty form with a DC (difference of convex functions) representation. To solve this problem efficiently, we design a DC algorithm in which the graphical lasso algorithm is repeatedly executed to solve convex optimization subproblems. Experimental results on two synthetic datasets show that our method achieves results comparable to or better than those of conventional methods for sparse GGM estimation. Our method is particularly advantageous for selecting true edges when cross-validation is used to determine the number of edges. Moreover, our DC algorithm converges within a practical time frame compared to the graphical lasso.
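The reformulation described in the abstract can be sketched as follows; the notation here is assumed for illustration (precision matrix \Omega, sample covariance matrix S, edge budget K, penalty parameter \gamma > 0, and largest-K norm \|\Omega\|_{(K)} defined as the sum of the K largest absolute entries), and details such as whether diagonal entries are counted follow the paper and may differ.

\min_{\Omega \succ 0} \; -\log\det\Omega + \mathrm{tr}(S\Omega) \quad \text{subject to} \quad \|\Omega\|_0 \le K,
\qquad
\|\Omega\|_0 \le K \;\Longleftrightarrow\; \|\Omega\|_1 - \|\Omega\|_{(K)} = 0 \quad (\text{since } \|\Omega\|_1 \ge \|\Omega\|_{(K)} \text{ always}),
\qquad
\min_{\Omega \succ 0} \; -\log\det\Omega + \mathrm{tr}(S\Omega) + \gamma\bigl(\|\Omega\|_1 - \|\Omega\|_{(K)}\bigr).

The penalized objective is a difference of the convex functions g(\Omega) = -\log\det\Omega + \mathrm{tr}(S\Omega) + \gamma\|\Omega\|_1 and h(\Omega) = \gamma\|\Omega\|_{(K)}. A DC algorithm linearizes h at the current iterate \Omega^t using a subgradient W^t \in \partial h(\Omega^t) and solves the convex subproblem

\Omega^{t+1} \in \arg\min_{\Omega \succ 0} \; -\log\det\Omega + \mathrm{tr}\bigl((S - W^t)\Omega\bigr) + \gamma\|\Omega\|_1,

which is a graphical lasso problem with the modified input matrix S - W^t, consistent with the abstract's statement that the graphical lasso algorithm is executed repeatedly.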
Pages: 23
Related Papers
Showing 10 of 50 records
  • [1] Estimation of Sparse Gaussian Graphical Models with Hidden Clustering Structure
    Lin, Meixia
    Sun, Defeng
    Toh, Kim-Chuan
    Wang, Chengjing
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25 : 1 - 36
  • [2] Sparse Gaussian Graphical Models for Speech Recognition
    Bell, Peter
    King, Simon
    INTERSPEECH 2007: 8TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION, VOLS 1-4, 2007: 1545 - 1548
  • [3] Edge detection in sparse Gaussian graphical models
    Luo, Shan
    Chen, Zehua
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 70 : 138 - 152
  • [4] Graphical models for sparse data: Graphical Gaussian models with vertex and edge symmetries
    Hojsgaard, Soren
    COMPSTAT 2008: PROCEEDINGS IN COMPUTATIONAL STATISTICS, 2008: 105 - 116
  • [5] Multiple Gaussian graphical estimation with jointly sparse penalty
    Tao, Qinghua
    Huang, Xiaolin
    Wang, Shuning
    Xi, Xiangming
    Li, Li
    SIGNAL PROCESSING, 2016, 128 : 88 - 97
  • [6] Learning Sparse Gaussian Graphical Models with Overlapping Blocks
    Hosseini, Mohammad Javad
    Lee, Su-In
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [7] Inferring sparse Gaussian graphical models with latent structure
    Ambroise, Christophe
    Chiquet, Julien
    Matias, Catherine
    ELECTRONIC JOURNAL OF STATISTICS, 2009, 3 : 205 - 238
  • [8] Bayesian Structure Learning in Sparse Gaussian Graphical Models
    Mohammadi, A.
    Wit, E. C.
    BAYESIAN ANALYSIS, 2015, 10 (01): 109 - 138
  • [9] Fitting very large sparse Gaussian graphical models
    Kiiveri, Harri
    de Hoog, Frank
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2012, 56 (09) : 2626 - 2636
  • [10] The cluster graphical lasso for improved estimation of Gaussian graphical models
    Tan, Kean Ming
    Witten, Daniela
    Shojaie, Ali
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2015, 85 : 23 - 36