Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout

Times Cited: 1
Authors
Chen, Hao [1 ]
Barthel, Thomas [2 ,3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Phys, CH-8093 Zurich, Switzerland
[2] Duke Univ, Dept Phys, Durham, NC 27708 USA
[3] Duke Univ, Duke Quantum Ctr, Durham, NC 27708 USA
Keywords
Machine learning; image classification; tensor networks; tree tensor networks; CP rank; tensor dropout; MATRIX RENORMALIZATION-GROUP; STATES; APPROXIMATION; MODELS;
DOI
10.1109/TPAMI.2024.3396386
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Tensor networks, developed in the context of condensed matter physics, approximate order-N tensors with a reduced number of degrees of freedom that scales only polynomially in N, arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be reduced substantially further by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach outperforms other tensor-network-based methods on Fashion-MNIST image classification. A low-rank TTN classifier with branching ratio b = 4 reaches a test-set accuracy of 90.3% with low computation costs. Consisting mostly of linear elements, tensor network classifiers avoid the vanishing-gradient problem of deep neural networks. The CP rank constraints have additional advantages: the number of parameters can be decreased and tuned more freely to control overfitting, improve generalization, and reduce computation costs. They also allow for trees with large branching ratios, substantially improving the representational power.
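To illustrate the two core ideas in the abstract, the following is a minimal NumPy sketch (not the authors' implementation): an order-3 tensor stored in CP form as factor matrices, contracted with input vectors without ever materializing the dense tensor, plus a toy "tensor dropout" that randomly zeroes whole CP components during training. All names, dimensions, and the dropout-rescaling convention are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# CP-rank-constrained order-3 tensor: T[i,j,k] = sum_r A[i,r] B[j,r] C[k,r].
# Storing the factors A, B, C instead of the dense tensor cuts the
# parameter count from d**3 to 3*d*r.
d, r = 4, 2  # mode dimension and CP rank (illustrative values)
A, B, C = (rng.normal(size=(d, r)) for _ in range(3))

def cp_contract(A, B, C, x, y):
    # Contract the CP tensor with input vectors x (mode i) and y (mode j),
    # returning the output vector over mode k -- without forming the
    # dense d x d x d tensor. Cost is O(d*r) instead of O(d**3).
    return C @ ((A.T @ x) * (B.T @ y))

def tensor_dropout(A, B, C, p, rng):
    # Toy "tensor dropout": drop entire CP components (factor columns)
    # with probability p during training, rescaling the survivors.
    keep = rng.random(r) >= p
    scale = 1.0 / max(keep.mean(), 1e-12)
    return A * keep * scale, B, C

# Sanity check against the dense tensor built explicitly.
x, y = rng.normal(size=d), rng.normal(size=d)
dense = np.einsum('ir,jr,kr->ijk', A, B, C)
assert np.allclose(cp_contract(A, B, C, x, y),
                   np.einsum('ijk,i,j->k', dense, x, y))
```

In a TTN classifier each internal node would carry such a CP-factored tensor, so contracting the whole tree stays cheap even for large branching ratios; the sketch above shows only a single node.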
Pages: 7825-7832
Page count: 8
Related Papers (50 total)
  • [21] Learning and Reasoning with Logic Tensor Networks
    Serafini, Luciano
    Garcez, Artur S. d'Avila
    AI*IA 2016: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2016, 10037 : 334 - 348
  • [22] Time dependent variational principle for tree tensor networks
    Bauernfeind, Daniel
    Aichhorn, Markus
    SCIPOST PHYSICS, 2020, 8 (02):
  • [23] Photonic Tensor Core for Machine Learning: a review
    Peserico, Nicola
    Ma, Xiaoxuan
    Shastri, Bhavin
    Sorger, Volker J.
    EMERGING TOPICS IN ARTIFICIAL INTELLIGENCE (ETAI) 2022, 2022, 12204
  • [24] Quantum Machine Learning Tensor Network States
    Kardashin, Andrey
    Uvarov, Alexey
    Biamonte, Jacob
    FRONTIERS IN PHYSICS, 2021, 8
  • [25] Tree-tensor-network classifiers for machine learning: From quantum inspired to quantum assisted
    Wall, Michael L.
    D'Aguanno, Giuseppe
    PHYSICAL REVIEW A, 2021, 104 (04)
  • [26] Generative machine learning with tensor networks: Benchmarks on near-term quantum computers
    Wall, Michael L.
    Abernathy, Matthew R.
    Quiroz, Gregory
    PHYSICAL REVIEW RESEARCH, 2021, 3 (02):
  • [27] Entanglement-Based Feature Extraction by Tensor Network Machine Learning
    Liu, Yuhan
    Li, Wen-Jun
    Zhang, Xiao
    Lewenstein, Maciej
    Su, Gang
    Ran, Shi-Ju
    FRONTIERS IN APPLIED MATHEMATICS AND STATISTICS, 2021, 7
  • [28] Image Completion Using Low Tensor Tree Rank and Total Variation Minimization
    Liu, Yipeng
    Long, Zhen
    Zhu, Ce
    IEEE TRANSACTIONS ON MULTIMEDIA, 2019, 21 (02) : 338 - 350
  • [29] Machine Learning Model for Flower Image Classification on a Tensor Processing Unit
    Biswas, Anik
    Garbaruk, Julia
    Logofatu, Doina
    INTELLIGENT DISTRIBUTED COMPUTING XV, IDC 2022, 2023, 1089 : 69 - 74
  • [30] ON THE OPTIMAL LINEAR CONTRACTION ORDER OF TREE TENSOR NETWORKS, AND BEYOND
    Stoian, Mihail
    Milbradt, Richard M.
    Mendl, Christian B.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2024, 46 (05) : B647 - B668