Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout

Times Cited: 1
Authors
Chen, Hao [1 ]
Barthel, Thomas [2 ,3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Phys, CH-8093 Zurich, Switzerland
[2] Duke Univ, Dept Phys, Durham, NC 27708 USA
[3] Duke Univ, Duke Quantum Ctr, Durham, NC 27708 USA
Keywords
Machine learning; image classification; tensor networks; tree tensor networks; CP rank; tensor dropout; MATRIX RENORMALIZATION-GROUP; STATES; APPROXIMATION; MODELS
DOI
10.1109/TPAMI.2024.3396386
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Tensor networks developed in the context of condensed matter physics try to approximate order-N tensors with a reduced number of degrees of freedom that is only polynomial in N and arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be further substantially reduced by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach is found to outperform other tensor-network-based methods in Fashion-MNIST image classification. A low-rank TTN classifier with branching ratio b = 4 reaches a test set accuracy of 90.3% with low computation costs. Consisting of mostly linear elements, tensor network classifiers avoid the vanishing gradient problem of deep neural networks. The CP rank constraints have additional advantages: The number of parameters can be decreased and tuned more freely to control overfitting, improve generalization properties, and reduce computation costs. They allow us to employ trees with large branching ratios, substantially improving the representation power.
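The abstract's central efficiency claim — that CP rank constraints make the order-3 tensors of a tree tensor network cheap to contract — can be illustrated with a short sketch. This is not the authors' implementation: the function name `cp_node`, the dimensions, and the binary tree (branching ratio b = 2 rather than the paper's b = 4) are illustrative assumptions. The point it shows is that a CP-decomposed tensor A[i,j,k] = Σ_r U[i,r] V[j,r] W[k,r] turns a full contraction into three small matrix-vector products, and that "tensor dropout" can be realized by randomly zeroing CP components during training.

```python
import numpy as np

rng = np.random.default_rng(0)

def cp_node(x, y, U, V, W, drop_p=0.0):
    """Contract two child vectors through an order-3 tensor in CP form.

    With A[i,j,k] = sum_r U[i,r] V[j,r] W[k,r], the contraction
    out[k] = sum_{i,j} A[i,j,k] x[i] y[j] factorizes as
    out = W @ ((U.T @ x) * (V.T @ y)),
    costing O(rank * dim) instead of O(dim^3) for a dense tensor.
    Tensor dropout zeroes individual CP components with probability drop_p.
    """
    z = (U.T @ x) * (V.T @ y)           # shape (rank,)
    if drop_p > 0.0:
        mask = rng.random(z.shape) >= drop_p
        z = z * mask / (1.0 - drop_p)   # inverted-dropout rescaling
    return W @ z

# Toy two-layer TTN on four leaf feature vectors (branching ratio b = 2).
d, rank = 4, 6
leaves = [rng.standard_normal(d) for _ in range(4)]
params = [tuple(rng.standard_normal((d, rank)) for _ in range(3))
          for _ in range(3)]            # (U, V, W) per tree node

h1 = cp_node(leaves[0], leaves[1], *params[0])
h2 = cp_node(leaves[2], leaves[3], *params[1])
out = cp_node(h1, h2, *params[2])       # root output; a real classifier
                                        # would map this to class logits
print(out.shape)
```

In this sketch the rank hyperparameter plays the tuning role described in the abstract: shrinking it reduces parameters and contraction cost independently of the bond dimension, which is what makes large branching ratios affordable.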
Pages: 7825-7832 (8 pages)