Convergence analysis of regularised Nyström method for functional linear regression

Cited by: 0
Authors
Gupta, Naveen [1 ]
Sivananthan, S. [1 ]
Affiliations
[1] Indian Institute of Technology Delhi, Department of Mathematics, New Delhi, India
Funding
National Research Foundation, Singapore
Keywords
functional linear regression; reproducing kernel Hilbert space; Nyström subsampling; regularization; covariance operator; kernel conjugate gradient
DOI
10.1088/1361-6420/adbfb6
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
The functional linear regression model has been widely studied and used to handle functional predictors. In this paper, we study the Nyström subsampling method, a strategy for tackling the computational burden inherent in big data analytics, in the context of the functional linear regression model within the reproducing kernel Hilbert space framework. By adopting a Nyström subsampling strategy, we aim to mitigate the computational overhead of kernel methods, which often scale poorly with dataset size. Specifically, we investigate a regularization-based approach combined with Nyström subsampling for the functional linear regression model, reducing the computational complexity from O(n^3) to O(m^2 n), where n is the size of the observed empirical dataset and m is the size of the subsampled dataset. Notably, we establish that these methods achieve optimal convergence rates, provided the subsampling level is chosen appropriately. We also present numerical examples of Nyström subsampling in the reproducing kernel Hilbert space framework for the functional linear regression model.
Pages: 19
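As a rough illustration of the complexity reduction described in the abstract above, the following Python sketch implements plain (uniform) Nyström subsampling for a kernel-based functional linear regression estimator. All names here (gaussian_kernel, flr_gram, nystrom_flr), the Gaussian kernel choice, the Riemann-sum discretisation, and the specific m x m linear system are assumptions made for this sketch; they are not taken from the paper and do not reproduce the authors' estimator or experiments.

```python
import numpy as np

# Hypothetical sketch of regularised Nyström subsampling for an RKHS-based
# functional linear regression model  Y_i = <X_i, beta> + noise.
# Everything below is an illustrative assumption, not the paper's method.

def gaussian_kernel(s, t, gamma=1.0):
    """Reproducing kernel K(s, t) on the domain of the functional predictors."""
    return np.exp(-gamma * (s[:, None] - t[None, :]) ** 2)

def flr_gram(X, t_grid, kernel):
    """Gram entries G_ij ~ double integral of X_i(s) K(s, t) X_j(t) ds dt,
    approximated by a Riemann sum; X has shape (n, len(t_grid))."""
    dt = t_grid[1] - t_grid[0]
    KX = kernel(t_grid, t_grid) @ (X.T * dt)      # shape (T, n)
    return (X * dt) @ KX                          # shape (n, n)

def nystrom_flr(X, y, t_grid, lam, m, kernel=gaussian_kernel, seed=0):
    """Regularised Nyström estimator: restrict the solution to the span of a
    random subsample of size m, so the linear algebra is O(m^2 n), not O(n^3)."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=m, replace=False)    # plain uniform subsampling
    G = flr_gram(X, t_grid, kernel)               # full Gram for clarity only;
    G_nm, G_mm = G[:, idx], G[np.ix_(idx, idx)]   # in practice build just these blocks
    # Solve the m x m system (G_nm^T G_nm + n * lam * G_mm) a = G_nm^T y.
    A = G_nm.T @ G_nm + n * lam * G_mm
    a = np.linalg.solve(A, G_nm.T @ y)
    # Slope estimate beta(t) = sum_j a_j * integral K(t, s) X_j(s) ds over the subsample.
    dt = t_grid[1] - t_grid[0]
    beta_hat = kernel(t_grid, t_grid) @ (X[idx].T * dt) @ a
    return beta_hat, idx
```

In this toy discretisation, the subsample size m controls the size of the linear system that must be solved; the paper's contribution is the statistical side of this trade-off, namely that a suitably chosen subsampling level retains the optimal convergence rate of the full regularised estimator.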