In this paper, we develop a nested chi-squared likelihood ratio test for selecting among shrinkage-regularized covariance estimators for background modeling in hyperspectral imagery. Critical to many target and anomaly detection algorithms is the modeling and estimation of the underlying background signal present in the data. This is especially important in hyperspectral imagery, where the signals of interest often account for only a small fraction of the observed variance, for example when targets are subpixel. The background is often modeled by a local or global multivariate Gaussian distribution, which necessitates estimating a covariance matrix. Maximum likelihood estimation of this matrix often overfits the available data, particularly in high-dimensional settings such as hyperspectral imagery, yielding subpar detection results. Instead, shrinkage estimators are often used to regularize the estimate. Shrinkage estimators linearly combine the overfit covariance with an underfit shrinkage target, thereby producing a well-fit estimator. These estimators introduce a shrinkage parameter, which controls the relative weighting between the covariance and the shrinkage target. Many methods have been proposed for setting this parameter, but comparing these methods and shrinkage values is typically done with a cross-validation procedure, which can be computationally expensive and highly sample-inefficient. Drawing from Bayesian regression methods, we compute the degrees of freedom of a covariance estimate using eigenvalue thresholding and employ a nested chi-squared likelihood ratio test for comparing estimators. This likelihood ratio test requires no cross-validation and enables direct, computationally efficient comparison of different shrinkage estimates.
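As a brief illustration of the quantities involved, a typical shrinkage estimator and the associated nested likelihood ratio comparison can be written as below; the parameterization of the shrinkage weight $\alpha$, the choice of target $T$, and the symbols $\ell$, $\Lambda$, and $\nu$ follow a common convention used here for exposition, not necessarily the exact forms adopted in this paper.
\[
\hat{\Sigma}(\alpha) \;=\; (1-\alpha)\,\hat{\Sigma}_{\mathrm{ML}} \;+\; \alpha\,T, \qquad \alpha \in [0,1],
\]
\[
\Lambda \;=\; 2\left[\ell\!\left(\hat{\Sigma}(\alpha_1)\right) - \ell\!\left(\hat{\Sigma}(\alpha_0)\right)\right] \;\overset{\text{approx.}}{\sim}\; \chi^2_{\nu_1 - \nu_0},
\]
where $\hat{\Sigma}_{\mathrm{ML}}$ is the maximum likelihood (sample) covariance, $T$ is the shrinkage target, $\ell(\cdot)$ is the Gaussian log-likelihood of the data under the given covariance, $\hat{\Sigma}(\alpha_0)$ is the more heavily shrunk estimate, and $\hat{\Sigma}(\alpha_1)$ the less shrunk one, with effective degrees of freedom $\nu_0 < \nu_1$ computed via eigenvalue thresholding. In a standard nested test, the less-shrunk estimate would be preferred only if $\Lambda$ exceeds the corresponding chi-squared quantile.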