Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout

Times Cited: 1
Authors
Chen, Hao [1 ]
Barthel, Thomas [2 ,3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Phys, CH-8093 Zurich, Switzerland
[2] Duke Univ, Dept Phys, Durham, NC 27708 USA
[3] Duke Univ, Duke Quantum Ctr, Durham, NC 27708 USA
Keywords
Machine learning; image classification; tensor networks; tree tensor networks; CP rank; tensor dropout; MATRIX RENORMALIZATION-GROUP; STATES; APPROXIMATION; MODELS
DOI
10.1109/TPAMI.2024.3396386
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Tensor networks, developed in the context of condensed-matter physics, approximate order-N tensors using a reduced number of degrees of freedom that scales only polynomially in N and is arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be reduced substantially further by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach outperforms other tensor-network-based methods in Fashion-MNIST image classification. A low-rank TTN classifier with branching ratio b = 4 reaches a test set accuracy of 90.3% with low computation costs. Consisting of mostly linear elements, tensor network classifiers avoid the vanishing gradient problem of deep neural networks. The CP rank constraints have additional advantages: the number of parameters can be decreased and tuned more freely to control overfitting, improve generalization properties, and reduce computation costs. They also allow us to employ trees with large branching ratios, substantially improving the representation power.
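To illustrate the mechanism described in the abstract, the following is a minimal sketch (not the authors' implementation; the pixel feature map, dimensions, function names, and dropout placement are assumptions for illustration) of how a single TTN node with branching ratio b = 4 and a CP rank constraint can contract its child feature vectors without ever forming the full order-(b+1) node tensor, and how tensor dropout can be applied to the CP components during training.

import numpy as np

def feature_map(pixels):
    """Map grayscale pixels in [0, 1] to two-component local feature vectors (assumed encoding)."""
    return np.stack([np.cos(np.pi / 2 * pixels), np.sin(np.pi / 2 * pixels)], axis=-1)

def cp_node_forward(children, factors, weights, drop_prob=0.0, rng=None):
    """Contract b child vectors with a CP-rank-constrained node tensor.

    The order-(b+1) node tensor is never built explicitly; it is parameterized as
    T = sum_r weights[r] * a^(1)_r (x) ... (x) a^(b)_r, so contracting it with
    child vectors v_1, ..., v_b costs only O(rank * b * d) operations.

    children  : list of b arrays of shape (d,)       -- child feature vectors
    factors   : list of b arrays of shape (rank, d)  -- CP factor matrices
    weights   : array of shape (rank, d_out)         -- output-leg factors
    drop_prob : probability of dropping each CP component (tensor dropout)
    """
    rank = weights.shape[0]
    comp = np.ones(rank)
    for v, a in zip(children, factors):
        comp *= a @ v                              # overlap of each CP component with child v
    if drop_prob > 0.0 and rng is not None:
        mask = rng.random(rank) >= drop_prob       # randomly drop CP components
        comp = comp * mask / (1.0 - drop_prob)     # inverted-dropout rescaling
    return comp @ weights                          # parent feature vector, shape (d_out,)

# Toy usage: one node with branching ratio b = 4 acting on four pixel features.
rng = np.random.default_rng(0)
b, d, d_out, rank = 4, 2, 3, 5
pixels = rng.random(b)
children = list(feature_map(pixels))                          # four (d,) vectors
factors = [rng.standard_normal((rank, d)) for _ in range(b)]
weights = rng.standard_normal((rank, d_out))
parent = cp_node_forward(children, factors, weights, drop_prob=0.2, rng=rng)
print(parent.shape)                                           # (3,)

In a full classifier, such nodes would be stacked in a tree until a single output vector of class scores remains; the CP rank and the dropout probability are the knobs the abstract refers to for controlling parameter count and overfitting.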
Pages: 7825-7832
Page count: 8
Related Papers
50 records in total
  • [41] Mean-field entanglement transitions in random tree tensor networks
    Lopez-Piqueres, Javier
    Ware, Brayden
    Vasseur, Romain
    PHYSICAL REVIEW B, 2020, 102 (06)
  • [42] Diffusion tensor imaging for the differential diagnosis of Parkinsonism by machine learning
    Tsai, Chih-Chien
    Chen, Yao-Liang
    Lu, Chin-Song
    Cheng, Jur-Shan
    Weng, Yi-Hsin
    Lin, Sung-Han
    Wu, Yi-Ming
    Wang, Jiun-Jie
    BIOMEDICAL JOURNAL, 2023, 46 (03)
  • [43] Implementation and comparison of algebraic and machine learning based tensor interpolation methods applied to fiber orientation tensor fields
    Blarr, J.
    Sabiston, T.
    Krauss, C.
    Bauer, J. K.
    Liebig, W. V.
    Inal, K.
    Weidenmann, K. A.
    COMPUTATIONAL MATERIALS SCIENCE, 2023, 228
  • [44] Low-Rank Tensor Completion: A Pseudo-Bayesian Learning Approach
    Chen, Wei
    Song, Nan
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 3325 - 3333
  • [45] Application of quantum-inspired tensor networks to optimize federated learning systems
    Bhatia, Amandeep Singh
    Saggi, Mandeep Kaur
    Kais, Sabre
    QUANTUM MACHINE INTELLIGENCE, 2025, 7 (01)
  • [46] A Low-Rank Tensor Dictionary Learning Method for Hyperspectral Image Denoising
    Gong, Xiao
    Chen, Wei
    Chen, Jie
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 1168 - 1180
  • [47] Low-rank tensor ring learning for multi-linear regression
    Liu, Jiani
    Zhu, Ce
    Long, Zhen
    Huang, Huyan
    Liu, Yipeng
    PATTERN RECOGNITION, 2021, 113
  • [48] LEARNING EFFICIENT TENSOR REPRESENTATIONS WITH RING-STRUCTURED NETWORKS
    Zhao, Qibin
    Sugiyama, Masashi
    Yuan, Longhao
    Cichocki, Andrzej
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 8608 - 8612
  • [49] Federated learning with tensor networks: a quantum AI framework for healthcare
    Bhatia, Amandeep Singh
    Neira, David E. Bernal
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2024, 5 (04):
  • [50] Learning Generative Models for Active Inference Using Tensor Networks
    Wauthier, Samuel T.
    Vanhecke, Bram
    Verbelen, Tim
    Dhoedt, Bart
    ACTIVE INFERENCE, IWAI 2022, 2023, 1721 : 285 - 297