Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions

Cited by: 2
Authors
Ali, Mazen [1 ]
Nouy, Anthony [2 ]
Affiliations
[1] Fraunhofer ITWM, D-67663 Kaiserslautern, Germany
[2] Nantes Univ, Cent Nantes, LMJL UMR CNRS 6629, Nantes, France
Keywords
Tensor networks; Tensor trains; Matrix product states; Neural networks; Approximation spaces; Besov spaces; Direct (Jackson) and inverse (Bernstein) inequalities; Hackbusch conjecture
DOI
10.1007/s00365-023-09620-w
Chinese Library Classification
O1 [Mathematics]
Discipline code
0701; 070101
Abstract
We study the approximation of univariate functions by combining tensorization of functions with tensor trains (TTs), a commonly used type of tensor network (TN). Lebesgue L^p-spaces in one dimension can be identified with tensor product spaces of arbitrary order through tensorization. We use this tensor product structure to define different approximation tools and corresponding approximation spaces of TTs, associated with different measures of complexity. The approximation tools are shown to achieve (near-)optimal approximation rates for functions with classical Besov smoothness. We then use classical interpolation theory to show that a scale of interpolated smoothness spaces is continuously embedded into the scale of TT approximation spaces; conversely, we show that the TT approximation spaces are, in a sense, much larger than smoothness spaces when the depth of the tensor network is unrestricted, but are embedded into a scale of interpolated smoothness spaces if the depth is restricted. The results of this work can be seen both as an analysis of the approximation spaces of a type of TN and as a study of the expressivity of a particular type of neural network (NN), namely feed-forward sum-product networks with sparse architecture. We point out interesting parallels to recent results on the expressivity of rectifier networks.
Pages: 463-544 (82 pages)
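To make the tensorization step from the abstract concrete, here is a minimal NumPy sketch; it illustrates the general idea only, not the paper's precise construction, and the helper names `tensorize` and `tt_svd` are our own. A function sampled on a dyadic grid of 2^d points is reshaped into an order-d tensor of mode size 2, indexed by the binary digits of the grid points, and then compressed into tensor-train cores by sequential truncated SVDs.

```python
import numpy as np

def tensorize(f, d):
    # Sample f on the dyadic grid {k / 2**d : k = 0, ..., 2**d - 1}
    # and reshape the samples into an order-d tensor with mode size 2;
    # entry T[b1, ..., bd] holds f(0.b1b2...bd) in binary notation.
    x = np.arange(2 ** d) / 2 ** d
    return f(x).reshape((2,) * d)

def tt_svd(tensor, tol=1e-10):
    # Standard TT-SVD sweep: peel off one binary variable at a time
    # with a truncated SVD; returns a list of order-3 TT cores.
    d = tensor.ndim
    cores, rank = [], 1
    mat = tensor.reshape(rank * 2, -1)
    for _ in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = max(1, int(np.sum(s > tol * s[0])))
        cores.append(u[:, :r_new].reshape(rank, 2, r_new))
        mat = (s[:r_new, None] * vt[:r_new]).reshape(r_new * 2, -1)
        rank = r_new
    cores.append(mat.reshape(rank, 2, 1))
    return cores

# The identity function f(x) = x tensorizes with TT ranks at most 2,
# since x = sum_k b_k 2**(-k) splits into two terms per binary variable.
cores = tt_svd(tensorize(lambda x: x, 8))
print([c.shape[2] for c in cores])  # every rank stays <= 2
```

The low TT ranks of such simple functions are exactly the kind of complexity measure the approximation spaces in the paper are built on: smoother or more structured functions admit tensorized representations with slowly growing ranks.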