Classification of Functional Data: A Comparative Study

Cited by: 0
Authors
Ramos-Carreño, Carlos [1]
Torrecilla, José Luis [2]
Suárez, Alberto [1]
Affiliations
[1] Univ Autonoma Madrid, Dept Comp Sci, Madrid, Spain
[2] Univ Autonoma Madrid, Dept Math, Madrid, Spain
Source
2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, ICMLA | 2022
Keywords
functional data analysis; classification; Mahalanobis distance; functional k-NN; DEPTH;
DOI
10.1109/ICMLA55696.2022.00143
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In functional classification problems, the data available for learning are functions rather than vectors of attributes. Consequently, multivariate classifiers need to be adapted, and new types of classifiers designed, to take into account the special characteristics of these types of data. In this work, an empirical evaluation of different classification methods is carried out on a variety of functional classification problems from different areas of application. The classifiers considered include nearest centroids with functional means as class prototypes and functional distances, standard multivariate classifiers used in combination with a variable selection method, classifiers based on the notion of functional depth, a functional version of k-nearest neighbors (k-NN), and random forest. From the results of this comparative study, one concludes that random forest is among the best off-the-shelf classifiers, not only for multivariate but also for functional classification problems. The variable selection method used in combination with a quadratic discriminant achieves fairly good overall accuracy using only a small set of impact points; this dimensionality reduction improves both efficiency and interpretability. Finally, a functional version of k-NN that uses the alpha-Mahalanobis distance exhibits consistently good predictive performance in all the problems considered. This robustness makes k-NN a good benchmark for functional classification.
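The functional k-NN classifier discussed in the abstract can be illustrated with a minimal sketch on curves discretized over a common grid. Note the assumptions: a plain L2 distance between discretized curves stands in for the alpha-Mahalanobis distance used in the paper, the `functional_knn_predict` helper and the sine/cosine toy data are hypothetical, and this is not the authors' implementation.

```python
import numpy as np

def l2_distance(f, g, grid):
    """Approximate L2 distance between two curves sampled on a uniform grid."""
    dt = grid[1] - grid[0]
    return np.sqrt(np.sum((f - g) ** 2) * dt)

def functional_knn_predict(X_train, y_train, X_test, grid, k=3):
    """Classify each test curve by majority vote among its k nearest
    training curves under the approximate L2 distance."""
    preds = []
    for f in X_test:
        dists = np.array([l2_distance(f, g, grid) for g in X_train])
        nearest = np.argsort(dists)[:k]
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)

# Toy data: noisy sine curves (class 0) vs. noisy cosine curves (class 1).
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)
X0 = np.sin(2 * np.pi * grid) + 0.1 * rng.standard_normal((20, 50))
X1 = np.cos(2 * np.pi * grid) + 0.1 * rng.standard_normal((20, 50))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)

# Clean sine and cosine curves should be assigned to their respective classes.
X_test = np.vstack([np.sin(2 * np.pi * grid), np.cos(2 * np.pi * grid)])
y_pred = functional_knn_predict(X_train, y_train, X_test, grid, k=5)
print(y_pred)  # expected: [0 1]
```

In practice one would use a functional data library (such as scikit-fda, developed by the first author) rather than hand-rolled distances, but the sketch shows the core idea: distances between whole curves replace distances between attribute vectors.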
Pages: 866-871
Page count: 6