Universal Approximation and the Topological Neural Network

Cited by: 0
Authors
Kouritzin, Michael A. [1 ]
Richard, Daniel [2 ]
Affiliations
[1] Univ Alberta, Dept Math & Stat Sci, Edmonton, AB T6G 2G1, Canada
[2] Stat Canada, Edmonton, AB T5J 0X6, Canada
Source
IEEE ACCESS | 2024 / Volume 12
Keywords
Neural networks; Extraterrestrial measurements; Topology; Heavily-tailed distribution; Feedforward neural networks; Toy manufacturing industry; Neural network; universal approximation; Tychonoff space; uniformity; distributions;
DOI
10.1109/ACCESS.2023.3342063
CLC classification code
TP [Automation Technology, Computer Technology]
Discipline classification code
0812
Abstract
A topological neural network (TNN), which takes input data from a Tychonoff topological space rather than the usual finite-dimensional space, is introduced. As a consequence, a distributional neural network (DNN) that takes Borel measures as data is also introduced. Combined, these new neural networks facilitate tasks such as recognizing long-range dependence, heavy tails, and other properties in stochastic process paths, or acting on belief states produced by particle filtering or hidden Markov model algorithms. The veracity of the TNN and DNN is then established herein by a strong universal approximation theorem for Tychonoff spaces and its corollary for spaces of measures. These theorems show that neural networks can approximate, to arbitrary accuracy, uniformly continuous functions (with respect to the sup metric) associated with a unique uniformity. We also provide some discussion showing that neural networks on positive finite measures generalize the recent deep-learning notion of deep sets.
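The deep-sets connection mentioned in the abstract can be illustrated with a minimal sketch (assumptions: the weights, dimensions, and `tanh` activations below are hypothetical choices for illustration, not the paper's construction): a sum/mean-pooled network can be read as acting on the empirical measure of its input points, which makes its output invariant to permutations of the sample.

```python
import numpy as np

# Hypothetical illustration of a deep-sets-style network viewed as a map on
# empirical measures mu = (1/n) * sum_i delta_{x_i}: a per-point feature map
# phi is integrated against mu (mean pooling), then a readout rho is applied.
rng = np.random.default_rng(0)
W_phi = rng.normal(size=(3, 4))   # weights of the per-point feature map phi
W_rho = rng.normal(size=(4, 1))   # weights of the readout rho

def deep_set(points: np.ndarray) -> float:
    """Evaluate rho( integral of phi d(mu) ) on an n x 3 array of points."""
    features = np.tanh(points @ W_phi)     # phi applied to each point
    pooled = features.mean(axis=0)         # integration against the empirical measure
    return float(np.tanh(pooled @ W_rho))  # rho on the pooled representation

x = rng.normal(size=(5, 3))
# The output depends only on the empirical measure of the points, so any
# reordering of the rows gives the same value.
assert np.isclose(deep_set(x), deep_set(x[::-1]))
```

Because the pooling step sees only the empirical measure, not the ordering or labeling of the points, such a network is naturally a function on (positive finite) measures, which is the viewpoint the paper's DNN generalizes.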
Pages: 115064-115084
Page count: 21
References
21 in total
[1]  
Antosik P., 1985, Qatar University Science Bulletin, V5, P41
[2]   On convergence determining and separating classes of functions [J].
Blount, Douglas ;
Kouritzin, Michael A. .
STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 2010, 120 (10) :1898-1907
[3]  
Chen YZ, 2021, Arxiv, DOI arXiv:2010.10079
[4]  
Cybenko G., 1989, Mathematics of Control, Signals, and Systems, V2, P303, DOI 10.1007/BF02551274
[5]  
Del Moral P., 2001, Electronic Journal of Probability, V6, P1
[6]  
Dong C, 2020, Arxiv, DOI arXiv:2011.00484
[7]  
Ethier Stewart N, 2009, Markov processes: characterization and convergence, V282, DOI 10.1002/9780470316658
[8]   APPROXIMATION CAPABILITIES OF MULTILAYER FEEDFORWARD NETWORKS [J].
HORNIK, K .
NEURAL NETWORKS, 1991, 4 (02) :251-257
[9]  
Khoussi S., 2021, NIST TN 2152, DOI 10.6028/NIST.TN.2152
[10]  
Kidger P, 2020, PR MACH LEARN RES, V125