Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions

Cited: 2
Authors
Ali, Mazen [1 ]
Nouy, Anthony [2 ]
Affiliations
[1] Fraunhofer ITWM, D-67663 Kaiserslautern, Germany
[2] Nantes Univ, Cent Nantes, LMJL UMR CNRS 6629, Nantes, France
Keywords
Tensor networks; Tensor trains; Matrix product states; Neural networks; Approximation spaces; Besov spaces; Direct (Jackson) and inverse (Bernstein) inequalities; Hackbusch conjecture
DOI
10.1007/s00365-023-09620-w
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
We study the approximation of univariate functions by combining tensorization of functions with tensor trains (TTs), a commonly used type of tensor network (TN). Through tensorization, Lebesgue L^p spaces in one dimension can be identified with tensor product spaces of arbitrary order. We use this tensor product structure to define different approximation tools and corresponding approximation spaces of TTs, associated with different measures of complexity. The approximation tools are shown to achieve (near-)optimal approximation rates for functions with classical Besov smoothness. We then use classical interpolation theory to show that a scale of interpolated smoothness spaces is continuously embedded into the scale of TT approximation spaces. Conversely, we show that the TT approximation spaces are, in a sense, much larger than smoothness spaces when the depth of the tensor network is unrestricted, but are embedded into a scale of interpolated smoothness spaces once the depth is restricted. The results of this work can be seen both as an analysis of the approximation spaces of a type of TN and as a study of the expressivity of a particular type of neural network (NN), namely feed-forward sum-product networks with sparse architecture. We point out interesting parallels to recent results on the expressivity of rectifier networks.
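To make the tensorization idea concrete, the sketch below samples a univariate function at the 2^d dyadic points of [0, 1), reshapes the samples into an order-d tensor whose indices are the binary digits of the argument, and compresses it with a plain TT-SVD. This is a minimal illustration of the setting, not the paper's construction; the dyadic grid, the truncation tolerance, and the example function sin(2*pi*x) are assumptions made here.

```python
import numpy as np

def tensorize(f, d):
    # Sample f at the 2^d dyadic points k / 2^d of [0, 1) and reshape into an
    # order-d tensor with mode sizes (2, ..., 2): entry (b_1, ..., b_d) holds
    # f(b_1/2 + b_2/4 + ... + b_d/2^d).
    x = np.arange(2**d) / 2**d
    return f(x).reshape((2,) * d)

def tt_svd(tensor, tol=1e-10):
    # Standard TT-SVD: sweep left to right, truncating each unfolding by SVD.
    # Returns a list of order-3 cores G_k of shape (r_{k-1}, 2, r_k).
    d = tensor.ndim
    cores, r = [], 1
    mat = tensor.reshape(r * 2, -1)
    for _ in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        rank = max(1, int(np.sum(s > tol * s[0])))  # relative truncation
        cores.append(U[:, :rank].reshape(r, 2, rank))
        mat = (s[:rank, None] * Vt[:rank]).reshape(rank * 2, -1)
        r = rank
    cores.append(mat.reshape(r, 2, 1))
    return cores

# A smooth function: TT ranks stay small and do not grow with the depth d.
d = 12
cores = tt_svd(tensorize(lambda x: np.sin(2 * np.pi * x), d))
print("TT ranks:", [c.shape[2] for c in cores[:-1]])
```

For this example the printed ranks are all 2 (sine admits an exact rank-2 tensorized representation via the angle-addition formula), so the TT complexity stays fixed while d controls the resolution; the ranks, rather than the grid size, are the relevant measure of complexity in the approximation spaces studied above.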
Pages: 463-544
Number of pages: 82