Quantile Regression Neural Networks: A Bayesian Approach

Authors
S. R. Jantre
S. Bhattacharya
T. Maiti
Affiliation
[1] Michigan State University, Department of Statistics and Probability
Source
Journal of Statistical Theory and Practice | 2021, Vol. 15
Keywords
Asymmetric Laplace density; Bayesian quantile regression; Bracketing entropy; Feedforward neural network; Hellinger distance; MCMC; Posterior consistency; Sieve asymptotics
Abstract
This article introduces a Bayesian neural network estimation method for quantile regression that assumes an asymmetric Laplace distribution (ALD) for the response variable. It is shown that the posterior distribution for feedforward neural network quantile regression is asymptotically consistent under a misspecified ALD model. The consistency proof embeds the problem in the density estimation domain and uses bounds on the bracketing entropy to establish posterior consistency over Hellinger neighborhoods. The result holds in the sieve setting where the number of hidden nodes grows with the sample size. The Bayesian implementation exploits the normal-exponential mixture representation of the ALD density, and posterior computation uses Markov chain Monte Carlo (MCMC) simulation: Gibbs sampling coupled with a Metropolis-Hastings step. We address the practical complexity of this MCMC implementation, including chain convergence, the choice of starting values, and step sizes, and we illustrate the proposed method with simulation studies and real data examples.
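The normal-exponential mixture representation mentioned in the abstract is the standard device that makes Gibbs sampling tractable for ALD likelihoods: a draw from ALD(mu, sigma, tau) can be written as y = mu + sigma*(theta*z + psi*sqrt(z)*u) with z ~ Exp(1) and u ~ N(0, 1), where theta = (1-2*tau)/(tau*(1-tau)) and psi^2 = 2/(tau*(1-tau)). A minimal NumPy sketch of this representation (the function name and defaults are illustrative, not from the paper):

```python
import numpy as np

def ald_mixture_samples(mu=0.0, sigma=1.0, tau=0.5, n=200_000, rng=None):
    """Draw from ALD(mu, sigma, tau) via its normal-exponential mixture:
    y = mu + sigma*(theta*z + psi*sqrt(z)*u), z ~ Exp(1), u ~ N(0, 1)."""
    rng = np.random.default_rng(0) if rng is None else rng
    theta = (1 - 2 * tau) / (tau * (1 - tau))
    psi = np.sqrt(2.0 / (tau * (1 - tau)))
    z = rng.exponential(1.0, size=n)      # latent exponential mixing variable
    u = rng.standard_normal(n)            # standard normal component
    return mu + sigma * (theta * z + psi * np.sqrt(z) * u)

# Sanity check: for ALD(mu, sigma, tau), P(Y <= mu) = tau,
# so the tau-th empirical quantile of the draws should sit near mu.
y = ald_mixture_samples(mu=1.5, tau=0.25)
print(np.quantile(y, 0.25))  # close to 1.5
```

In the Bayesian quantile regression setting, conditioning on the latent z turns the ALD likelihood into a (weighted) Gaussian one, which is what allows Gibbs updates for the network parameters to be combined with Metropolis-Hastings steps.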