Likelihood-based and Bayesian methods for Tweedie compound Poisson linear mixed models

Cited by: 138
Author
Zhang, Yanwei [1]
Affiliation
[1] University of Southern California, Department of Marketing, Marshall School of Business, Los Angeles, CA 90089, USA
Keywords
Adaptive Gauss-Hermite quadrature; Extended quasi-likelihood; Laplace approximation; Monte Carlo EM; Maximum likelihood estimation; Mixed models; Penalized quasi-likelihood; Tweedie compound Poisson distribution
DOI
10.1007/s11222-012-9343-7
Chinese Library Classification (CLC) number
TP301 [Theory, Methods]
Discipline classification code
081202
Abstract
The Tweedie compound Poisson distribution is a subclass of the exponential dispersion family with a power variance function, in which the value of the power index lies in the interval (1,2). It is well known that the Tweedie compound Poisson density function is not analytically tractable, and numerical procedures that allow the density to be evaluated accurately and quickly did not appear until fairly recently. Unsurprisingly, there has been little statistical literature devoted to full maximum likelihood inference for Tweedie compound Poisson mixed models. To date, the focus has been on estimation methods in the quasi-likelihood framework. Further, Tweedie compound Poisson mixed models involve an unknown variance function, which has a significant impact on hypothesis tests and predictive uncertainty measures. The estimation of the unknown variance function is thus of independent interest in many applications, yet quasi-likelihood-based methods are not well suited to this task. This paper presents several likelihood-based inferential methods for the Tweedie compound Poisson mixed model that enable estimation of the variance function from the data. These algorithms include the likelihood approximation method, in which both the integral over the random effects and the compound Poisson density function are evaluated numerically, and the latent variable approach, in which maximum likelihood estimation is carried out via the Monte Carlo EM algorithm, without the need for approximating the density function. In addition, we derive the corresponding Markov chain Monte Carlo algorithm for a Bayesian formulation of the mixed model. We demonstrate the use of the various methods through a numerical example, and conduct an array of simulation studies to evaluate the statistical properties of the proposed estimators.
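As context for the model the abstract describes, the following is a minimal illustrative sketch (not code from the paper) of simulating a Tweedie compound Poisson variable via its Poisson-sum-of-gammas representation, using the standard parameterization in terms of the mean mu, the dispersion phi, and the power index p in (1,2). The function name `rtweedie` is a hypothetical helper, not an API from the paper.

```python
import numpy as np

def rtweedie(n, mu, phi, p, rng=None):
    """Draw n Tweedie compound Poisson variates (1 < p < 2) using the
    Poisson-sum-of-gammas representation: Y = X_1 + ... + X_N, with
    N ~ Poisson(lam) and X_i ~ Gamma(shape=alpha, scale=theta)."""
    if not 1.0 < p < 2.0:
        raise ValueError("power index p must lie in (1, 2)")
    rng = np.random.default_rng(rng)
    lam = mu ** (2.0 - p) / (phi * (2.0 - p))   # Poisson rate of events
    alpha = (2.0 - p) / (p - 1.0)               # Gamma shape per event
    theta = phi * (p - 1.0) * mu ** (p - 1.0)   # Gamma scale per event
    N = rng.poisson(lam, size=n)                # number of events
    # A sum of N i.i.d. Gammas with common scale is Gamma(N * alpha, theta);
    # N == 0 yields an exact zero, i.e. the point mass at the origin.
    y = rng.gamma(np.maximum(N, 1) * alpha, theta)
    return np.where(N > 0, y, 0.0)
```

Under this parameterization E[Y] = mu and Var[Y] = phi * mu**p, which recovers the power variance function mentioned in the abstract; the positive probability of an exact zero is what makes the density analytically intractable for 1 < p < 2.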
Pages: 743-757 (15 pages)