ELECTRONIC JOURNAL OF STATISTICS | 2023 / Volume 17 / Issue 02
Keywords:
Bahadur expansions in L_1;
Hadamard differentiability;
kernel density estimation;
root-n consistent;
nonparametric density;
linear processes;
convergence;
DOI:
10.1214/23-EJS2166
Chinese Library Classification (CLC):
O21 [Probability Theory and Mathematical Statistics];
C8 [Statistics];
Subject classification codes:
020208;
070103;
0714;
Abstract:
This paper studies a class of plug-in estimators of the stationary density of an autoregressive model with autoregression parameter 0 < g < 1. These estimators use two types of innovation density estimators: a standard kernel estimator and a weighted kernel estimator whose weights are chosen to mimic the condition that the innovation density has mean zero. Bahadur expansions are obtained for this class of estimators in L_1, the space of integrable functions. These stochastic expansions establish root-n consistency in the L_1-norm. It is shown that the density estimators based on the weighted kernel estimator are asymptotically efficient if an asymptotically efficient estimator of the autoregression parameter is used. Here asymptotic efficiency is understood in the sense of the Hájek-Le Cam convolution theorem.
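As a rough illustration of the construction described in the abstract, the Python sketch below builds a plug-in estimator of the stationary density of an AR(1) model. It is a minimal sketch under stated assumptions, not the paper's estimator: it assumes the plug-in form g_hat(t) = (1/n) sum_j f_hat(t - rho_hat * X_{j-1}), uses a Gaussian kernel, approximates the "weights mimicking the mean-zero condition" by empirical-likelihood-type weights satisfying sum_j w_j * eps_hat_j = 0, and plugs in a least-squares estimate of the autoregression parameter rather than an asymptotically efficient one. All function names and tuning choices (bandwidth, kernel) are hypothetical.

```python
import numpy as np


def residuals(x, rho_hat):
    """Estimated innovations: eps_hat_j = X_j - rho_hat * X_{j-1}."""
    return x[1:] - rho_hat * x[:-1]


def gauss_kernel(u):
    """Standard normal density, used here as the smoothing kernel (an assumption)."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)


def mean_zero_weights(eps, tol=1e-10, max_iter=100):
    """Empirical-likelihood-type weights w_j = 1 / (n * (1 + lam * eps_j)),
    with lam found by Newton's method so that sum_j w_j * eps_j = 0.
    This is one way to mimic the mean-zero condition on the innovation
    density; the paper's weights may be defined differently."""
    n = len(eps)
    lam = 0.0
    for _ in range(max_iter):
        denom = 1.0 + lam * eps
        g = np.mean(eps / denom)              # constraint to drive to zero
        dg = -np.mean(eps ** 2 / denom ** 2)  # derivative with respect to lam
        step = g / dg
        lam -= step
        if abs(step) < tol:
            break
    return 1.0 / (n * (1.0 + lam * eps))


def kernel_innovation_density(eps, bandwidth, weights=None):
    """Kernel estimator of the innovation density: standard (equal weights)
    if weights is None, weighted otherwise. Returns a callable f_hat."""
    n = len(eps)
    if weights is None:
        weights = np.full(n, 1.0 / n)

    def f_hat(t):
        u = (np.atleast_1d(t)[:, None] - eps[None, :]) / bandwidth
        return (gauss_kernel(u) * weights[None, :]).sum(axis=1) / bandwidth

    return f_hat


def plug_in_stationary_density(x, rho_hat, bandwidth, weighted=True):
    """Plug-in estimator g_hat(t) = (1/n) sum_j f_hat(t - rho_hat * X_{j-1}),
    based on the representation g(t) = E f(t - rho * X_0) of the
    stationary density of the AR(1) model (assumed form)."""
    eps = residuals(x, rho_hat)
    w = mean_zero_weights(eps) if weighted else None
    f_hat = kernel_innovation_density(eps, bandwidth, w)
    lagged = x[:-1]

    def g_hat(t):
        return np.array([f_hat(ti - rho_hat * lagged).mean()
                         for ti in np.atleast_1d(t)])

    return g_hat


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rho, n, burn = 0.5, 500, 100
    x = np.zeros(n + burn)
    for j in range(1, n + burn):
        x[j] = rho * x[j - 1] + rng.normal()
    x = x[burn:]
    # Least-squares estimate of the autoregression parameter (not the
    # asymptotically efficient estimator the abstract refers to).
    rho_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
    g_hat = plug_in_stationary_density(x, rho_hat, bandwidth=0.4)
    print(np.round(g_hat(np.linspace(-3.0, 3.0, 7)), 3))
```

Setting weighted=False in plug_in_stationary_density reproduces the variant built on the standard (unweighted) kernel estimator of the innovation density, the other member of the class of estimators discussed in the abstract.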