On regression and classification with possibly missing response variables in the data

Cited by: 0
Authors
Mojirsheibani, Majid [1 ]
Pouliot, William [2 ]
Shakhbandaryan, Andre [1 ]
Affiliations
[1] Calif State Univ Northridge, Dept Math, Northridge, CA 91330 USA
[2] Univ Birmingham, Dept Econ, Birmingham, England
Funding
U.S. National Science Foundation
Keywords
Regression; Partially observed data; Kernel; Convergence; Classification; Margin condition; Linear regression; Margin; Models
DOI
10.1007/s00184-023-00923-3
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
This paper considers the problem of kernel regression and classification with possibly unobservable response variables in the data, where the mechanism that causes the absence of information can depend on both the predictors and the response variables. Our proposed approach involves two steps: first, we construct a family of models (possibly infinite-dimensional) indexed by the unknown parameter of the missing-probability mechanism. In the second step, a search is carried out to find the empirically optimal member of an appropriate cover (or subclass) of the underlying family, in the sense of minimizing the mean squared prediction error. The main focus of the paper is on the theoretical properties of these estimators; the issue of identifiability is also addressed. Our methods use a data-splitting approach which is quite easy to implement. We also derive exponential bounds on the performance of the resulting estimators in terms of their deviations from the true regression curve in general L_p norms, where we allow the size of the cover or subclass to diverge as the sample size n increases. These bounds immediately yield various strong convergence results for the proposed estimators. As an application of our findings, we consider the problem of statistical classification based on the proposed regression estimators and also look into their rates of convergence under different settings. Although this work is stated mainly for kernel-type estimators, it can also be extended to other popular local-averaging methods such as nearest-neighbor and histogram estimators.
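As a rough illustration of the two-step procedure sketched in the abstract, the snippet below builds a family of inverse-probability-weighted kernel estimators indexed by a candidate missingness parameter theta, then uses data splitting to pick the empirical minimizer of the squared prediction error over a finite grid standing in for the cover. This is a minimal sketch under assumed choices: the Nadaraya-Watson form, the user-supplied missingness model pi, the 50/50 split, and the names kernel_ipw_estimate and select_theta are illustrative and not taken from the paper.

```python
import numpy as np

def kernel_ipw_estimate(x0, X, Y, delta, theta, h, pi):
    """One candidate member of the family: a Nadaraya-Watson estimate at x0
    that reweights the observed responses by the inverse of a candidate
    missingness probability pi(x, y, theta).  Illustrative form only."""
    obs = delta == 1                                   # complete cases only
    K = np.exp(-0.5 * ((X[obs] - x0) / h) ** 2)        # Gaussian kernel weights
    w = 1.0 / np.clip(pi(X[obs], Y[obs], theta), 1e-3, None)
    den = np.sum(K * w)
    return np.sum(K * w * Y[obs]) / den if den > 0 else np.nan

def select_theta(X, Y, delta, theta_grid, h, pi, seed=0):
    """Data-splitting search over a finite grid (a stand-in for the cover):
    fit on one half of the sample, score the empirical squared prediction
    error on the held-out complete cases, and keep the empirical minimizer."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    train, test = idx[: len(X) // 2], idx[len(X) // 2:]
    held_out = test[delta[test] == 1]                  # test points with observed Y
    best_theta, best_err = None, np.inf
    for theta in theta_grid:
        preds = np.array([kernel_ipw_estimate(x, X[train], Y[train],
                                               delta[train], theta, h, pi)
                          for x in X[held_out]])
        err = np.nanmean((Y[held_out] - preds) ** 2)   # empirical prediction error
        if err < best_err:
            best_theta, best_err = theta, err
    return best_theta
```

For instance, with a hypothetical logistic missingness model pi = lambda x, y, th: 1.0 / (1.0 + np.exp(-(th[0] + th[1] * y))), one would call select_theta(X, Y, delta, theta_grid, h=0.5, pi=pi) and plug the selected theta back into kernel_ipw_estimate for prediction; for the classification application described in the abstract, a plug-in classifier would simply threshold the resulting regression estimate at 1/2.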
Pages: 607-648
Number of pages: 42