Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions

Cited by: 2
Authors
Ali, Mazen [1 ]
Nouy, Anthony [2 ]
Affiliations
[1] Fraunhofer ITWM, D-67663 Kaiserslautern, Germany
[2] Nantes Univ, Cent Nantes, LMJL UMR CNRS 6629, Nantes, France
Keywords
Tensor networks; Tensor trains; Matrix product states; Neural networks; Approximation spaces; Besov spaces; Direct (Jackson) and inverse (Bernstein) inequalities; HACKBUSCH CONJECTURE;
DOI
10.1007/s00365-023-09620-w
Chinese Library Classification
O1 [Mathematics];
Discipline codes
0701 ; 070101 ;
Abstract
We study the approximation of univariate functions by combining tensorization of functions with tensor trains (TTs), a commonly used type of tensor network (TN). Lebesgue L^p-spaces in one dimension can be identified with tensor product spaces of arbitrary order through tensorization. We use this tensor product structure to define different approximation tools and corresponding approximation spaces of TTs, associated with different measures of complexity. The approximation tools are shown to achieve (near-)optimal approximation rates for functions with classical Besov smoothness. We then use classical interpolation theory to show that a scale of interpolated smoothness spaces is continuously embedded into the scale of TT approximation spaces. Conversely, we show that the TT approximation spaces are, in a sense, much larger than smoothness spaces when the depth of the tensor network is not restricted, but are embedded into a scale of interpolated smoothness spaces if one restricts the depth. The results of this work can be seen both as an analysis of the approximation spaces of a type of TN and as a study of the expressivity of a particular type of neural network (NN), namely feed-forward sum-product networks with sparse architecture. We point out interesting parallels to recent results on the expressivity of rectifier networks.
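To make the tensorization idea from the abstract concrete, here is a minimal Python sketch (not from the paper; the names `tensorize` and `tt_ranks` are our own). A function on [0, 1) sampled at 2^d dyadic points is reshaped into an order-d tensor indexed by the binary digits of x, and its TT ranks are computed by a standard TT-SVD sweep. Smooth functions with separable structure then exhibit small, depth-independent ranks.

```python
import numpy as np

def tensorize(f, d):
    """Sample f at the 2**d dyadic points k/2**d on [0, 1) and reshape the
    sample vector into an order-d tensor with mode sizes (2, ..., 2).
    Multi-index (b_1, ..., b_d) corresponds to x = 0.b_1 b_2 ... b_d in binary."""
    x = np.arange(2**d) / 2**d
    return f(x).reshape((2,) * d)

def tt_ranks(tensor, tol=1e-10):
    """Ranks of a tensor-train (TT) decomposition via sequential truncated
    SVDs (the standard TT-SVD sweep), with relative truncation tolerance tol."""
    d = tensor.ndim
    ranks = [1]
    core = tensor.reshape(1, -1)
    for _ in range(d - 1):
        # Unfold: (previous rank * mode size) x (all remaining modes).
        core = core.reshape(ranks[-1] * 2, -1)
        u, s, vt = np.linalg.svd(core, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))
        ranks.append(r)
        core = s[:r, None] * vt[:r]   # pass the truncated remainder onward
    ranks.append(1)
    return ranks

# exp(x) = prod_k exp(b_k / 2**k) separates over the binary digits, so its
# tensorization is a rank-1 tensor: every TT rank equals 1.
print(tt_ranks(tensorize(np.exp, 10)))
# sin(a + b) = sin(a)cos(b) + cos(a)sin(b), so all interior TT ranks equal 2.
print(tt_ranks(tensorize(np.sin, 10)))
```

The tolerance-based truncation matters: in exact arithmetic the unfolding matrices have exact low rank here, and the SVD threshold merely filters floating-point noise.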
Pages: 463 - 544
Page count: 82
Related Papers
44 records in total (10 shown)
  • [1] Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions
    Ali, Mazen
    Nouy, Anthony
    CONSTRUCTIVE APPROXIMATION, 2023, 58 : 463 - 544
  • [2] TREE ADAPTIVE APPROXIMATION IN THE HIERARCHICAL TENSOR FORMAT
    Ballani, Jonas
    Grasedyck, Lars
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2014, 36 (04) : A1415 - A1431
  • [3] Committor functions via tensor networks
    Chen, Yian
    Hoskins, Jeremy
    Khoo, Yuehaw
    Lindsey, Michael
    JOURNAL OF COMPUTATIONAL PHYSICS, 2023, 472
  • [4] NEURAL NETWORKS AND THE APPROXIMATION THEORY
    Enachescu, Calin
    PROCEEDINGS OF THE EUROPEAN INTEGRATION: BETWEEN TRADITION AND MODERNITY, VOL 5, 2013, 5 : 1155 - 1164
  • [5] Approximation by ridge functions and neural networks
    Petrushev, PP
    SIAM JOURNAL ON MATHEMATICAL ANALYSIS, 1998, 30 (01) : 155 - 189
  • [6] On the approximation of functions by tanh neural networks
    De Ryck, Tim
    Lanthaler, Samuel
    Mishra, Siddhartha
    NEURAL NETWORKS, 2021, 143 : 732 - 750
  • [7] Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout
    Chen, Hao
    Barthel, Thomas
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12) : 7825 - 7832
  • [8] LEARNING HIGH-DIMENSIONAL PROBABILITY DISTRIBUTIONS USING TREE TENSOR NETWORKS
    Grelier, Erwan
    Nouy, Anthony
    Lebrun, Regis
    INTERNATIONAL JOURNAL FOR UNCERTAINTY QUANTIFICATION, 2022, 12 (05) : 47 - 69
  • [9] Constructive approximation of discontinuous functions by neural networks
    Llanas, B.
    Lantaron, S.
    Sainz, F. J.
    NEURAL PROCESSING LETTERS, 2008, 27 (03) : 209 - 226
  • [10] Approximation by neural networks and learning theory
    Maiorov, V
    JOURNAL OF COMPLEXITY, 2006, 22 (01) : 102 - 117