Inverse Result of Approximation for the Max-Product Neural Network Operators of the Kantorovich Type and Their Saturation Order

Cited by: 10
Authors
Cantarini, Marco [1 ]
Coroianu, Lucian [2 ]
Costarelli, Danilo [3 ]
Gal, Sorin G. [2 ]
Vinti, Gianluca [3 ]
Affiliations
[1] Marche Polytech Univ, Dept Ind Engn & Math Sci, I-60121 Ancona, Italy
[2] Univ Oradea, Dept Math & Comp Sci, RO-410087 Oradea, Romania
[3] Univ Perugia, Dept Math & Comp Sci, I-06123 Perugia, Italy
Keywords
sigmoidal functions; ReLU function; neural network operators; saturation result; local inverse theorem; weighted approximation
DOI
10.3390/math10010063
Chinese Library Classification
O1 [Mathematics]
Discipline classification codes
0701; 070101
Abstract
In this paper, we consider the max-product neural network operators of the Kantorovich type based on certain linear combinations of sigmoidal and ReLU activation functions. It is well known that max-product type operators have applications in problems related to probability and fuzzy theory, involving both real and interval/set-valued functions. Here, in particular, we face inverse approximation problems for the above family of sub-linear operators. We first establish their saturation order for a certain class of functions; i.e., we show that if a continuous and non-decreasing function f can be approximated with a rate of convergence higher than 1/n, as n goes to +∞, then f must be constant. Furthermore, we prove a local inverse theorem of approximation; i.e., assuming that f can be approximated with a rate of convergence of 1/n, then f turns out to be Lipschitz continuous.
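The two results stated in the abstract can be sketched formally as follows. This is only a summary under assumed notation: we write the max-product Kantorovich operator as K_n^(M) and the sup-norm error as the left-hand side; the precise hypotheses and (local) norms are those of the paper itself.

```latex
% Saturation order (sketch, assumed notation K_n^{(M)} for the operators):
% a rate faster than 1/n forces f to be constant.
\[
\bigl\| K_n^{(M)}(f) - f \bigr\|_{\infty} = o\!\left(\tfrac{1}{n}\right)
\ \text{as } n \to +\infty
\quad \Longrightarrow \quad f \ \text{is constant},
\]
% Local inverse theorem (sketch): a rate of exactly 1/n
% implies Lipschitz continuity of f.
\[
\bigl\| K_n^{(M)}(f) - f \bigr\|_{\infty} = \mathcal{O}\!\left(\tfrac{1}{n}\right)
\ \text{as } n \to +\infty
\quad \Longrightarrow \quad f \ \text{is Lipschitz continuous},
\]
% both for f continuous and non-decreasing, as in the abstract.
```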
Pages: 11