Modelling additive extremile regression by iteratively penalized least asymmetric weighted squares and gradient descent boosting

Cited by: 0
Author
Geng, Ziwen [1 ,2 ]
Affiliations
[1] Renmin Univ China, Ctr Appl Stat, Beijing 100872, Peoples R China
[2] Renmin Univ China, Sch Stat, Beijing 100872, Peoples R China
Keywords
Extremile regression; additive model; iteratively penalized least asymmetric weighted squares; gradient descent boosting; tropical cyclone intensity of North Atlantic; QUANTILES; RISK; PREDICTION; EXPECTILES;
DOI
10.1080/02331888.2024.2348077
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Quantile regression has emerged as a standard tool for regression analysis because it enables a proper assessment of the complete conditional distribution of the response. This article considers a valuable alternative to quantiles, called extremiles. Extremiles are far better suited than quantiles to serve as risk measures that remain alert to the magnitude of infrequent catastrophic losses. Additive regression models, in turn, make the regression structure more flexible by including nonlinear effects of continuous covariates and interaction effects. Accordingly, an additive extremile regression model based on minimizing an asymmetrically weighted sum of squared residuals is introduced. Different estimation procedures are presented, including iteratively penalized least asymmetric weighted squares and gradient descent boosting. The properties of these procedures are investigated in a simulation study and in an analysis of North Atlantic tropical cyclone intensity, which illustrates how variable selection can be carried out simultaneously with fitting the additive extremile regression model.
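The abstract's characterization of extremiles as minimizers of an asymmetrically weighted sum of squared residuals can be sketched in the simplest, unconditional case. The sketch below follows the transform K_tau introduced by Daouia, Gijbels and Stupfler (2019); the function names, the midpoint empirical CDF, and the L-statistic form are illustrative choices, not the estimation procedures of this article (which handle the additive, covariate-dependent case).

```python
import numpy as np

def K_tau(t, tau):
    # Distribution transform of Daouia, Gijbels & Stupfler (2019):
    # K_tau(t) = 1 - (1-t)^{s(tau)} for tau <= 1/2, and t^{r(tau)} for tau >= 1/2,
    # with s(tau) = log(1/2)/log(1-tau) and r(tau) = log(1/2)/log(tau).
    if tau <= 0.5:
        s = np.log(0.5) / np.log(1.0 - tau)
        return 1.0 - (1.0 - t) ** s
    r = np.log(0.5) / np.log(tau)
    return t ** r

def sample_extremile(y, tau):
    # L-statistic estimator: weight the order statistics y_(1) <= ... <= y_(n)
    # by increments of K_tau over the grid i/n; the weights sum to one.
    y = np.sort(np.asarray(y, dtype=float))
    n = y.size
    grid = np.arange(n + 1) / n
    w = np.diff(K_tau(grid, tau))
    return float(w @ y)

def extremile_weighted_ls(y, tau):
    # Asymmetric-weighted-least-squares view: the tau-extremile minimizes
    # sum_i J_tau(F_hat(y_i)) * (y_i - theta)^2 with J_tau = K_tau', so the
    # minimizer is the J_tau-weighted mean (here with a midpoint empirical CDF).
    y = np.sort(np.asarray(y, dtype=float))
    n = y.size
    F_hat = (np.arange(1, n + 1) - 0.5) / n
    if tau <= 0.5:
        s = np.log(0.5) / np.log(1.0 - tau)
        J = s * (1.0 - F_hat) ** (s - 1.0)
    else:
        r = np.log(0.5) / np.log(tau)
        J = r * F_hat ** (r - 1.0)
    return float(np.sum(J * y) / np.sum(J))
```

At tau = 1/2 both estimators reduce exactly to the sample mean (K_{1/2}(t) = t, so the weights are uniform), and for tau near 1 the weight mass shifts to the largest order statistics, which is what makes extremiles responsive to the magnitude of extreme losses rather than only their frequency.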
Pages: 576-595
Number of pages: 20