Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout

Cited by: 1
Authors
Chen, Hao [1 ]
Barthel, Thomas [2 ,3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Phys, CH-8093 Zurich, Switzerland
[2] Duke Univ, Dept Phys, Durham, NC 27708 USA
[3] Duke Univ, Duke Quantum Ctr, Durham, NC 27708 USA
Keywords
Machine learning; image classification; tensor networks; tree tensor networks; CP rank; tensor dropout; MATRIX RENORMALIZATION-GROUP; STATES; APPROXIMATION; MODELS;
DOI
10.1109/TPAMI.2024.3396386
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Tensor networks, developed in the context of condensed-matter physics, approximate order-N tensors using a number of degrees of freedom that grows only polynomially in N, arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be reduced substantially further by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach outperforms other tensor-network-based methods in Fashion-MNIST image classification: a low-rank TTN classifier with branching ratio b = 4 reaches a test-set accuracy of 90.3% at low computation cost. Consisting mostly of linear elements, tensor-network classifiers avoid the vanishing-gradient problem of deep neural networks. The CP rank constraints have additional advantages: the number of parameters can be decreased and tuned more freely to control overfitting, improve generalization, and reduce computation costs, and they allow trees with large branching ratios, substantially improving the representation power.
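The parameter saving behind the CP rank constraint can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the dimensions and rank below are hypothetical. An order-3 TTN node tensor of CP rank r is stored as three factor matrices, so a contraction with two incoming feature vectors costs O(d·r) instead of O(d³):

```python
import numpy as np

# Order-3 tensor of CP rank r, stored as factor matrices A, B, C with
# T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
d, r = 8, 3  # hypothetical local dimension and CP rank
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((d, r)) for _ in range(3))

# Dense tensor for comparison: d**3 = 512 entries vs. 3*d*r = 72 parameters.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# A TTN node maps two incoming feature vectors x, y to an output vector.
x, y = rng.standard_normal(d), rng.standard_normal(d)
out_cp = C @ ((A.T @ x) * (B.T @ y))        # CP-format contraction, O(d*r)
out_full = np.einsum('ijk,i,j->k', T, x, y)  # dense contraction, O(d**3)
assert np.allclose(out_cp, out_full)
```

The same factorization applies at every node of the tree, which is what makes large branching ratios affordable: a node with b children and dense tensors would cost O(d^(b+1)), while the CP format keeps the cost linear in b.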
Pages: 7825-7832 (8 pages)
Related Papers (50 total)
  • [31] Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions
    Ali, Mazen
    Nouy, Anthony
    CONSTRUCTIVE APPROXIMATION, 2023, 58 (02) : 463 - 544
  • [33] Decohering tensor network quantum machine learning models
    Liao, Haoran
    Convy, Ian
    Yang, Zhibo
    Whaley, K. Birgitta
    QUANTUM MACHINE INTELLIGENCE, 2023, 5 (01)
  • [35] Dictionary Learning With Low-Rank Coding Coefficients for Tensor Completion
    Jiang, Tai-Xiang
    Zhao, Xi-Le
    Zhang, Hao
    Ng, Michael K.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (02) : 932 - 946
  • [36] From Probabilistic Graphical Models to Generalized Tensor Networks for Supervised Learning
    Glasser, Ivan
    Pancotti, Nicola
    Cirac, J. Ignacio
    IEEE ACCESS, 2020, 8 (08) : 68169 - 68182
  • [37] Learning Tensor Low-Rank Representation for Hyperspectral Anomaly Detection
    Wang, Minghua
    Wang, Qiang
    Hong, Danfeng
    Roy, Swalpa Kumar
    Chanussot, Jocelyn
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (01) : 679 - 691
  • [38] A MOMENTUM BLOCK-RANDOMIZED STOCHASTIC ALGORITHM FOR LOW-RANK TENSOR CP DECOMPOSITION
    Wang, Qingsong
    Cui, Chunfeng
    Han, Deren
    PACIFIC JOURNAL OF OPTIMIZATION, 2021, 17 (03): : 433 - 452
  • [39] Quantum process tomography with unsupervised learning and tensor networks
    Torlai, Giacomo
    Wood, Christopher J.
    Acharya, Atithi
    Carleo, Giuseppe
    Carrasquilla, Juan
    Aolita, Leandro
    NATURE COMMUNICATIONS, 2023, 14 (01)
  • [40] Tensor tree decomposition as a rank-reduction method for pre-stack interpolation
    Manenti, Rafael
    Sacchi, Mauricio D.
    GEOPHYSICAL PROSPECTING, 2023, 71 (08) : 1404 - 1419