Functional data analysis using deep neural networks

Cited by: 2
Authors
Wang, Shuoyang [1 ]
Zhang, Wanyu [2 ]
Cao, Guanqun [3 ]
Huang, Yuan [2 ]
Affiliations
[1] Univ Louisville, Dept Bioinformat & Biostat, Louisville, KY 40292 USA
[2] Yale Univ, Dept Biostat, New Haven, CT USA
[3] Michigan State Univ, Dept Stat & Probabil, E Lansing, MI USA
Keywords
deep learning; functional data analysis; neural networks; nonparametric regression; convergence rates; classification; discrimination; approximation; bounds
DOI
10.1002/wics.70001
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Functional data analysis is an evolving field focused on analyzing data that provide information about curves, surfaces, or other objects varying over a continuous domain. Such data are typically characterized by the dependence and smoothness inherent in each observed curve. Traditional functional data analysis approaches have predominantly relied on linear models, which, while foundational, often fall short in capturing intricate, nonlinear relationships in the data. This paper seeks to bridge this gap by reviewing the integration of deep neural networks into functional data analysis. Deep neural networks offer a flexible way to handle these complexities, performing particularly well in high-dimensional settings and accommodating diverse data structures. This review aims to advance functional data regression, classification, and representation by integrating deep neural networks with functional data analysis, fostering a synergistic union between the two fields. The ability of deep neural networks to handle intricate functional data points to a wealth of opportunities for further research across interdisciplinary areas. This article is categorized under:
Data: Types and Structure > Time Series, Stochastic Processes, and Functional Data
Statistical Learning and Exploratory Methods of the Data Sciences > Deep Learning
Statistical Learning and Exploratory Methods of the Data Sciences > Neural Networks
Pages: 19
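As a concrete illustration of the functional regression setting described in the abstract, the minimal sketch below follows one common strategy in this line of work: project each observed curve onto a finite basis and feed the resulting coefficient vector to a feedforward neural network. The Fourier basis, the simulated data, and the use of scikit-learn's MLPRegressor are illustrative assumptions, not the specific methods reviewed in the article.

```python
# Illustrative sketch (not the authors' implementation): scalar-on-function
# regression via basis projection plus a small feedforward neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n, m, K = 200, 101, 7                      # curves, grid points, basis functions
t = np.linspace(0.0, 1.0, m)               # common observation grid on [0, 1]

# Fourier basis evaluated on the grid: 1, sin(2*pi*k*t), cos(2*pi*k*t), ...
basis = [np.ones_like(t)]
for k in range(1, (K - 1) // 2 + 1):
    basis += [np.sin(2 * np.pi * k * t), np.cos(2 * np.pi * k * t)]
B = np.column_stack(basis)                 # m x K basis matrix

# Simulate noisy functional predictors X_i(t) and a nonlinear scalar response.
scores = rng.normal(size=(n, K))
X = scores @ B.T + 0.1 * rng.normal(size=(n, m))
y = np.sin(scores[:, 1]) + scores[:, 2] ** 2 + 0.1 * rng.normal(size=n)

# Project each curve onto the basis (least squares) to get finite-dim features.
coef = np.linalg.lstsq(B, X.T, rcond=None)[0].T    # n x K coefficient matrix

# Fit a small feedforward network on the basis coefficients.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(coef[:150], y[:150])
print("held-out R^2:", net.score(coef[150:], y[150:]))
```

The basis projection turns each infinite-dimensional curve into a fixed-length feature vector, after which any standard network architecture can model nonlinear effects of the functional predictor; the review also covers architectures that act on the discretized curves directly.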