Statistical Optimality of Divide and Conquer Kernel-based Functional Linear Regression

Cited: 0
Authors
Liu, Jiading [1]
Shi, Lei [1,2,3]
Affiliations
[1] Fudan Univ, Sch Math Sci, Shanghai 200433, Peoples R China
[2] Fudan Univ, Shanghai Key Lab Contemporary Appl Math, Shanghai 200433, Peoples R China
[3] Shanghai Artificial Intelligence Lab, 701 Yunjin Rd, Shanghai 200232, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
functional linear regression; reproducing kernel Hilbert space; divide-and-conquer estimator; model misspecification; minimax optimal rates; MINIMAX; RATES; CONSISTENCY; PREDICTION
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Previous analyses of regularized functional linear regression in a reproducing kernel Hilbert space (RKHS) typically require the target function to lie in the kernel space. This paper studies the convergence performance of divide-and-conquer estimators in the scenario where the target function does not necessarily reside in the underlying RKHS. As a decomposition-based scalable approach, divide-and-conquer estimators of functional linear regression can substantially reduce the time and memory complexity of the algorithm. We develop an integral-operator approach to establish sharp finite-sample upper bounds for prediction with divide-and-conquer estimators under various regularity conditions on the explanatory variables and the target function. We also prove the asymptotic optimality of the derived rates by establishing minimax lower bounds. Finally, we consider the convergence of noiseless estimators and show that the rates can be arbitrarily fast under mild conditions.
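The divide-and-conquer scheme described in the abstract (fit a local kernel-based estimator on each data partition, then average) can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: it uses a Gaussian kernel on grid-discretized curves and plain kernel ridge regression, and all names, the toy data-generating model, and the choice of kernel and regularization parameter are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy functional linear model: curves X_i observed on a grid,
# Y_i = <beta, X_i> + noise, with the inner product as a Riemann sum.
grid = np.linspace(0.0, 1.0, 50)
n, m = 600, 4                        # sample size and number of partitions
beta = np.sin(2 * np.pi * grid)      # illustrative slope function
X = rng.standard_normal((n, grid.size)).cumsum(axis=1) / np.sqrt(grid.size)
Y = X @ beta / grid.size + 0.1 * rng.standard_normal(n)

def krr_fit(Xs, Ys, lam=1e-2):
    """Kernel ridge regression with a Gaussian kernel on discretized curves."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).mean(axis=2)
        return np.exp(-d2)
    # Solve (K + lam * n_s * I) alpha = Y on this partition only.
    alpha = np.linalg.solve(
        kernel(Xs, Xs) + lam * len(Ys) * np.eye(len(Ys)), Ys
    )
    return lambda Xt: kernel(Xt, Xs) @ alpha

# Divide and conquer: fit a local estimator per partition, then average.
parts = np.array_split(np.arange(n), m)
local_fits = [krr_fit(X[idx], Y[idx]) for idx in parts]
predict = lambda Xt: np.mean([f(Xt) for f in local_fits], axis=0)

Xtest = rng.standard_normal((100, grid.size)).cumsum(axis=1) / np.sqrt(grid.size)
Ytest = Xtest @ beta / grid.size
mse = np.mean((predict(Xtest) - Ytest) ** 2)
```

Each partition solves a linear system of size n/m instead of n, which is the source of the time and memory savings the abstract refers to; the paper's analysis concerns how large m can be while the averaged estimator retains the minimax-optimal prediction rate.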
Pages: 1-56 (56 pages)