Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures

Cited by: 1
Authors
Jiang, Wenhua [1 ,2 ]
Zhang, Cun-Hui [3 ]
Affiliations
[1] Fudan Univ, Sch Math Sci, 220 Handan Rd, Shanghai 200433, Peoples R China
[2] Shanghai Ctr Math Sci, 220 Handan Rd, Shanghai 200433, Peoples R China
[3] Rutgers State Univ, Dept Stat & Biostat, 110 Frelinghuysen Rd, Piscataway, NJ 08854 USA
Keywords
Gaussian mixtures; Hermite polynomials; likelihood ratio test; rate of divergence; two-component mixtures; MAXIMUM-LIKELIHOOD; BAYES ESTIMATION; CONVERGENCE; ASYMPTOTICS;
DOI
10.3150/18-BEJ1094
CLC classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification codes
020208; 070103; 0714;
Abstract
We study a nonparametric likelihood ratio test (NPLRT) for Gaussian mixtures, based on the nonparametric maximum likelihood estimator in the context of demixing. The test concerns whether a random sample is drawn from the standard normal distribution; for the alternative hypothesis we consider mixing distributions of unbounded support. We prove that the divergence rate of the NPLRT under the null is bounded by log n, provided that the support range of the mixing distribution increases no faster than (log n / log 9)^{1/2}. We also prove that the rate (log n)^{1/2} is a lower bound for the divergence rate if the support range increases no slower than the order of (log n)^{1/2}. Implications of the upper bound for the rate of divergence are discussed.
Pages: 3400-3420
Number of pages: 21