Online inference with debiased stochastic gradient descent

Cited by: 6
Authors
Han, Ruijian [1 ]
Luo, Lan [2 ]
Lin, Yuanyuan [3 ]
Huang, Jian [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Appl Math, 11 Yuk Choi Rd, Hung Hom, Kowloon, Hong Kong 999077, Peoples R China
[2] Rutgers Sch Publ Hlth, Dept Biostat & Epidemiol, 683 Hoes Lane West, Piscataway, NJ 08854 USA
[3] Chinese Univ Hong Kong, Dept Stat, Cent Ave, Shatin, Hong Kong 999077, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Confidence interval; High-dimensional statistics; Online learning; Stochastic gradient descent
DOI
10.1093/biomet/asad046
CLC number
Q [Biological Sciences]
Subject classification codes
07; 0710; 09
Abstract
We propose a debiased stochastic gradient descent algorithm for online statistical inference with high-dimensional data. Our approach combines the debiasing technique developed in high-dimensional statistics with the stochastic gradient descent algorithm. It can be used to construct confidence intervals efficiently in an online fashion. Our proposed algorithm has several appealing aspects: as a one-pass algorithm, it reduces the time complexity; in addition, each update step requires only the current data together with the previous estimate, which reduces the space complexity. We establish the asymptotic normality of the proposed estimator under mild conditions on the sparsity level of the parameter and the data distribution. Numerical experiments demonstrate that the proposed debiased stochastic gradient descent algorithm attains the nominal coverage probability. Furthermore, we illustrate our method with an analysis of a high-dimensional text dataset.
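To make the recipe in the abstract concrete, the following is a minimal Python sketch of one-pass debiased SGD inference for a single coordinate of a sparse linear model. It illustrates the general debiasing idea rather than the authors' exact algorithm: the step-size schedule, the l1 penalty level `lam`, the identity-covariance design, and the plug-in variance estimate are all simplifying assumptions made for this example.

```python
# A hypothetical sketch of online debiased SGD for one coordinate of a
# sparse linear model. Illustrative only; not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)
p, n, j = 200, 5000, 0                 # dimension, stream length, coordinate of interest
beta_true = np.zeros(p)
beta_true[:3] = [1.0, -0.5, 0.25]      # sparse true parameter (simulated)

beta = np.zeros(p)                     # penalized SGD iterate
w = np.zeros(p)
w[j] = 1.0                             # projection direction for coordinate j
corr_sum = 0.0                         # running sum of debiasing corrections
var_sum = 0.0                          # running sum for a plug-in variance estimate
lam = 0.05                             # assumed l1 penalty level

for t in range(1, n + 1):
    # One new observation arrives; no earlier data are stored.
    x = rng.standard_normal(p)
    y = x @ beta_true + rng.standard_normal()
    lr = 0.5 / np.sqrt(t)              # assumed step-size schedule

    resid = y - x @ beta               # residual at the previous estimate
    corr_sum += (w @ x) * resid        # online debiasing correction
    var_sum += ((w @ x) * resid) ** 2  # for a sandwich-style variance

    # Proximal SGD step on the l1-penalized squared loss (soft-thresholding).
    beta += lr * resid * x
    beta = np.sign(beta) * np.maximum(np.abs(beta) - lr * lam, 0.0)

    # SGD on 0.5 * w' E[xx'] w - w_j, whose minimizer is the j-th column
    # of the inverse covariance (here the identity, so w -> e_j).
    grad_w = (x @ w) * x
    grad_w[j] -= 1.0
    w -= lr * grad_w

beta_debiased = beta[j] + corr_sum / n  # debiased point estimate of beta_j
se = np.sqrt(var_sum) / n               # standard error of the averaged correction
print(f"95% CI for beta_{j}: [{beta_debiased - 1.96 * se:.3f}, "
      f"{beta_debiased + 1.96 * se:.3f}]")
```

Each iteration touches only the current observation and the running iterates, so both the time and space costs per step are O(p), matching the one-pass, low-memory properties highlighted in the abstract.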
Pages: 93-108
Number of pages: 16
Related papers
50 records in total
  • [31] Batched Stochastic Gradient Descent with Weighted Sampling
    Needell, Deanna
    Ward, Rachel
    APPROXIMATION THEORY XV, 2017, 201 : 279 - 306
  • [32] Towards stability and optimality in stochastic gradient descent
    Toulis, Panos
    Tran, Dustin
    Airoldi, Edoardo M.
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 1290 - 1298
  • [33] Predicting Throughput of Distributed Stochastic Gradient Descent
    Li, Zhuojin
    Paolieri, Marco
    Golubchik, Leana
    Lin, Sung-Han
    Yan, Wumo
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (11) : 2900 - 2912
  • [34] Convergent Stochastic Almost Natural Gradient Descent
    Sanchez-Lopez, Borja
    Cerquides, Jesus
    ARTIFICIAL INTELLIGENCE RESEARCH AND DEVELOPMENT, 2019, 319 : 54 - 63
  • [35] On the diffusion approximation of nonconvex stochastic gradient descent
    Hu, Wenqing
    Li, Chris Junchi
    Li, Lei
    Liu, Jian-Guo
    ANNALS OF MATHEMATICAL SCIENCES AND APPLICATIONS, 2019, 4 (01) : 3 - 32
  • [36] Stochastic Gradient Descent Support Vector Clustering
    Pham, Tung
    Dang, Hang
    Le, Trung
    Le, Hoang-Thai
    PROCEEDINGS OF 2015 2ND NATIONAL FOUNDATION FOR SCIENCE AND TECHNOLOGY DEVELOPMENT CONFERENCE ON INFORMATION AND COMPUTER SCIENCE NICS 2015, 2015: 88 - 93
  • [37] Fractional stochastic gradient descent for recommender systems
    Khan, Zeshan Aslam
    Chaudhary, Naveed Ishtiaq
    Zubair, Syed
    ELECTRONIC MARKETS, 2019, 29 (02) : 275 - 285
  • [39] Bolstering stochastic gradient descent with model building
    Birbil, S. Ilker
    Martin, Ozgur
    Onay, Gonenc
    Oztoprak, Figen
    TOP, 2024, 32 (03) : 517 - 536
  • [40] Error Analysis of Stochastic Gradient Descent Ranking
    Chen, Hong
    Tang, Yi
    Li, Luoqing
    Yuan, Yuan
    Li, Xuelong
    Tang, Yuanyan
    IEEE TRANSACTIONS ON CYBERNETICS, 2013, 43 (03) : 898 - 909