Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification

Cited by: 21
Authors
Makantasis, Konstantinos [1 ]
Georgogiannis, Alexandros [2 ,3 ]
Voulodimos, Athanasios [4 ]
Georgoulas, Ioannis [5 ]
Doulamis, Anastasios [5 ]
Doulamis, Nikolaos [5 ]
Affiliations
[1] Univ Malta, Inst Digital Games, MSD-2080 Msida, Malta
[2] Tech Univ Crete, Sch Elect & Comp Engn, Khania 73100, Greece
[3] DeepLab, Athens 11741, Greece
[4] Univ West Attica, Dept Informat & Comp Engn, Aigaleo 12243, Greece
[5] Natl Tech Univ Athens, Sch Rural & Surveying Engn, Athens 15773, Greece
Funding
European Union Horizon 2020;
Keywords
High-order data processing; hyperspectral data classification; Rank-R FNN; tensor-based neural networks; DISCRIMINANT-ANALYSIS; REGRESSION;
DOI
10.1109/ACCESS.2021.3072973
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
An increasing number of emerging applications in data science and engineering are based on multidimensional and structurally rich data. The irregularities, however, of high-dimensional data often compromise the effectiveness of standard machine learning algorithms. We hereby propose the Rank-R Feedforward Neural Network (FNN), a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters, thereby offering two core advantages compared to typical machine learning methods. First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension. Moreover, the number of the model's trainable parameters is substantially reduced, making it very efficient for small-sample-setting problems. We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets. Experimental evaluations show that Rank-R FNN is a computationally inexpensive alternative to the ordinary FNN that achieves state-of-the-art performance on higher-order tensor data.
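The parameter reduction described in the abstract comes from replacing a full weight tensor with a rank-R Canonical/Polyadic (CP) factorization. The following minimal NumPy sketch illustrates the idea; the function name `cp_inner_product`, the shapes, and the contraction order are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cp_inner_product(X, factors):
    """Inner product <W, X> of an N-mode input tensor X with a weight
    tensor W stored in CP form: W = sum_r w_r^(1) o w_r^(2) o ... o w_r^(N).

    factors: list of N matrices, the n-th of shape (I_n, R), whose r-th
    columns are the rank-1 components of W. The full W (prod(I_n) entries)
    is never materialized; storage is only R * sum(I_n) parameters.
    """
    R = factors[0].shape[1]
    total = 0.0
    for r in range(R):
        # Contract X mode by mode with the r-th column of each factor;
        # after each step the leading mode has been absorbed, so we always
        # contract axis 0. The final result is a scalar.
        Y = X
        for f in factors:
            Y = np.tensordot(Y, f[:, r], axes=([0], [0]))
        total += Y
    return total

# Illustrative shapes: a 2x3x4 input with rank R = 2 factors.
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 3, 4))
factors = [rng.standard_normal((n, 2)) for n in (2, 3, 4)]
y = cp_inner_product(X, factors)
```

A full weight tensor for this toy input would hold 2*3*4 = 24 parameters per hidden unit, while the CP form holds 2*(2+3+4) = 18; the gap widens rapidly with tensor order and mode sizes, which is the source of the small-sample efficiency claimed above.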
Pages: 58609-58620
Page count: 12
Related References
38 in total
[1]   LEARNABILITY AND THE VAPNIK-CHERVONENKIS DIMENSION [J].
BLUMER, A ;
EHRENFEUCHT, A ;
HAUSSLER, D ;
WARMUTH, MK .
JOURNAL OF THE ACM, 1989, 36 (04) :929-965
[2]   Kernel-based methods for hyperspectral image classification [J].
Camps-Valls, G ;
Bruzzone, L .
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2005, 43 (06) :1351-1362
[3]   Deep Learning-Based Classification of Hyperspectral Data [J].
Chen, Yushi ;
Lin, Zhouhan ;
Zhao, Xing ;
Wang, Gang ;
Gu, Yanfeng .
IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2014, 7 (06) :2094-2107
[4]  
Chu W., 2009, P AISTATS, P89
[5]  
Cichocki A, 2016, FOUND TRENDS MACH LE, V9, P431, DOI [10.1561/2200000059, 10.1561/2200000067]
[6]  
Cybenko G., 1989, Mathematics of Control, Signals, and Systems, V2, P303, DOI 10.1007/BF02551274
[7]   5D MODELLING: AN EFFICIENT APPROACH FOR CREATING SPATIOTEMPORAL PREDICTIVE 3D MAPS OF LARGE-SCALE CULTURAL RESOURCES [J].
Doulamis, Anastasios ;
Doulamis, Nikolaos ;
Ioannidis, Charalabos ;
Chrysouli, Christina ;
Grammalidis, Nikos ;
Dimitropoulos, Kosmas ;
Potsiou, Chryssy ;
Stathopoulou, Elisavet Konstantina ;
Ioannides, Marinos .
25TH INTERNATIONAL CIPA SYMPOSIUM 2015, 2015, :61-68
[8]  
Garipov T., 2016, Ultimate tensorization: Compressing convolutional and fc layers alike
[9]   MULTILINEAR TENSOR REGRESSION FOR LONGITUDINAL RELATIONAL DATA [J].
Hoff, Peter D. .
ANNALS OF APPLIED STATISTICS, 2015, 9 (03) :1169-1193
[10]   UNIVERSAL APPROXIMATION OF AN UNKNOWN MAPPING AND ITS DERIVATIVES USING MULTILAYER FEEDFORWARD NETWORKS [J].
HORNIK, K ;
STINCHCOMBE, M ;
WHITE, H .
NEURAL NETWORKS, 1990, 3 (05) :551-560