Online inference with debiased stochastic gradient descent

Cited by: 6
Authors
Han, Ruijian [1 ]
Luo, Lan [2 ]
Lin, Yuanyuan [3 ]
Huang, Jian [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hung Hom, 11 Yuk Choi Rd, Hong Kong 999077, Peoples R China
[2] Rutgers Sch Publ Hlth, Dept Biostat & Epidemiol, 683 Hoes Lane West, Piscataway, NJ 08854 USA
[3] Chinese Univ Hong Kong, Dept Stat, Shatin, Cent Ave, Hong Kong 999077, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Confidence interval; High-dimensional statistics; Online learning; Stochastic gradient descent
DOI
10.1093/biomet/asad046
CLC classification
Q [Biological Sciences];
Subject classification
07; 0710; 09;
Abstract
We propose a debiased stochastic gradient descent algorithm for online statistical inference with high-dimensional data. Our approach combines the debiasing technique developed in high-dimensional statistics with the stochastic gradient descent algorithm. It can be used to construct confidence intervals efficiently in an online fashion. Our proposed algorithm has several appealing aspects: as a one-pass algorithm, it reduces the time complexity; in addition, each update step requires only the current data together with the previous estimate, which reduces the space complexity. We establish the asymptotic normality of the proposed estimator under mild conditions on the sparsity level of the parameter and the data distribution. Numerical experiments demonstrate that the proposed debiased stochastic gradient descent algorithm attains nominal coverage probability. Furthermore, we illustrate our method with analysis of a high-dimensional text dataset.
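The abstract's description — a one-pass algorithm in which each update uses only the current observation and the previous estimate, with a debiasing correction that yields online confidence intervals — can be illustrated with a toy sketch. This is not the paper's algorithm: the sparse linear model, the step-size schedule, the soft-threshold level `lam`, and the sandwich-style plug-in variance estimate below are all simplifying assumptions, shown for a single coordinate of interest (j = 0).

```python
import numpy as np

# Hedged sketch of one-pass debiased SGD for a sparse linear model,
# targeting inference on coordinate 0. All tuning choices are assumptions.
rng = np.random.default_rng(0)

p, n = 20, 5000
beta = np.zeros(p)
beta[0] = 2.0                      # true signal in the coordinate we test

theta = np.zeros(p)                # running sparse SGD estimate
w = np.zeros(p)                    # running debiasing direction, w -> Sigma^{-1} e_0
e0 = np.zeros(p)
e0[0] = 1.0
lam = 0.01                         # soft-threshold level (assumed)
corr_sum = 0.0                     # accumulates the one-step bias correction
var_sum = 0.0                      # accumulates the plug-in variance

for t in range(1, n + 1):
    # one new observation per step: one pass, O(p) memory
    x = rng.standard_normal(p)
    y = x @ beta + rng.standard_normal()
    eta = 1.0 / (p + t)            # decaying step size (assumed schedule)

    resid = y - x @ theta
    corr_sum += (w @ x) * resid            # debiasing correction term
    var_sum += (w @ x) ** 2 * resid ** 2   # sandwich-style variance term

    # proximal SGD update: gradient step, then soft-thresholding for sparsity
    theta = theta + eta * resid * x
    theta = np.sign(theta) * np.maximum(np.abs(theta) - eta * lam, 0.0)

    # SGD on the quadratic whose minimizer is w = Sigma^{-1} e_0
    w = w - eta * ((x @ w) * x - e0)

debiased = theta[0] + corr_sum / n         # debiased estimate of beta[0]
se = np.sqrt(var_sum) / n                  # estimated standard error
ci = (debiased - 1.96 * se, debiased + 1.96 * se)
print(f"debiased estimate: {debiased:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```

Each iteration touches one observation and keeps only `theta`, `w`, and two running scalars, which is what gives the time- and space-complexity reductions the abstract highlights.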
Pages: 93-108
Page count: 16