Block-enhanced precision matrix estimation for large-scale datasets
Cited by: 5
Authors:
Eftekhari, Aryan [1]
Pasadakis, Dimosthenis [1]
Bollhoefer, Matthias [2]
Scheidegger, Simon [3]
Schenk, Olaf [1]
Affiliations:
[1] Univ Svizzera Italiana, Fac Informat, Inst Comp, Lugano, Switzerland
[2] TU Braunschweig, Inst Numer Anal, Braunschweig, Germany
[3] Univ Lausanne, Dept Econ, Lausanne, Switzerland
Funding:
Swiss National Science Foundation;
Keywords:
Covariance matrices;
Graphical model;
Optimization;
Gaussian Markov random field;
Machine learning application;
SPARSE;
SELECTION;
PARALLEL;
SOLVER;
MODEL;
DOI:
10.1016/j.jocs.2021.101389
Chinese Library Classification (CLC) number:
TP39 [Computer Applications];
Subject classification codes:
081203;
0835;
Abstract:
The ℓ1-regularized Gaussian maximum likelihood method is a common approach for sparse precision matrix estimation, but one that poses a computational challenge for high-dimensional datasets. We present a novel ℓ1-regularized maximum likelihood method for performant large-scale sparse precision matrix estimation utilizing the block structures in the underlying computations. We identify the computational bottlenecks and contribute a block coordinate descent update as well as a block approximate matrix inversion routine, which is then parallelized using a shared-memory scheme. We demonstrate the effectiveness, accuracy, and performance of these algorithms. Our numerical examples and comparative results with various modern open-source packages reveal that these precision matrix estimation methods can accelerate the computation of covariance matrices by two to three orders of magnitude, while keeping memory requirements modest. Furthermore, we conduct large-scale case studies for applications from finance and medicine with several thousand random variables to demonstrate applicability for real-world datasets.
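For orientation, the ℓ1-regularized Gaussian maximum likelihood problem the abstract refers to is the standard graphical-lasso objective: minimize tr(SΘ) − log det Θ + λ‖Θ‖₁ over positive-definite Θ, where S is the sample covariance matrix. The following is a minimal illustrative sketch of that baseline formulation using scikit-learn's GraphicalLasso on synthetic data; it is not the paper's block-enhanced solver, and the dimension, sample size, and regularization strength `alpha` below are arbitrary assumptions.

```python
# Minimal sketch (not the paper's block-enhanced algorithm): estimate a sparse
# precision matrix via l1-regularized Gaussian maximum likelihood (graphical lasso).
# Dimension p, sample size n, and `alpha` are illustrative assumptions.
import numpy as np
from sklearn.covariance import GraphicalLasso
from sklearn.datasets import make_sparse_spd_matrix

rng = np.random.default_rng(0)
p, n = 50, 400

# Ground-truth sparse precision matrix; draw Gaussian samples from its inverse.
theta_true = make_sparse_spd_matrix(p, alpha=0.95, random_state=0)
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(theta_true), size=n)

# Solve: minimize_{Theta > 0}  tr(S Theta) - log det Theta + alpha * ||Theta||_1
model = GraphicalLasso(alpha=0.05, max_iter=200).fit(X)
theta_hat = model.precision_

# Report how sparse the recovered estimate is.
nnz = np.count_nonzero(np.abs(theta_hat) > 1e-8)
print(f"estimated precision matrix: {p}x{p}, {nnz} nonzero entries")
```

The paper's contribution targets exactly the bottlenecks such a dense, single-threaded baseline exposes at large p, via block coordinate descent updates and a parallel block approximate matrix inversion.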
Pages: 13