Neulft: A Novel Approach to Nonlinear Canonical Polyadic Decomposition on High-Dimensional Incomplete Tensors

Cited by: 93
Authors
Luo, Xin [1 ,2 ]
Wu, Hao [1 ,2 ]
Li, Zechao [3 ]
Affiliations
[1] Southwest Univ, Coll Comp & Informat Sci, Chongqing 400715, Peoples R China
[2] Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing 400714, Peoples R China
[3] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Tensors; Data models; Computational modeling; Adaptation models; Training; Artificial neural networks; Neurons; Nonlinear tensor model; Big Data; high-dimensional and incomplete tensor; latent factorization of tensors; hyper-parameter adaptation; LATENT FACTORIZATION; CP DECOMPOSITION; MODEL; OPTIMIZATION; PREDICTION; ALGORITHM; NETWORK; TERM; LINK;
DOI
10.1109/TKDE.2022.3176466
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A High-Dimensional and Incomplete (HDI) tensor frequently arises in big data-related applications concerning complex dynamic interactions among numerous entities. Traditional tensor factorization-based models cannot handle an HDI tensor efficiently, while existing latent factorization of tensors models are all linear and thus unable to capture an HDI tensor's nonlinearity. Motivated by this observation, this paper proposes a Neural Latent Factorization of Tensors model, which provides a novel approach to nonlinear Canonical Polyadic decomposition on an HDI tensor. It is implemented with three key ideas: a) adopting the density-oriented modeling principle to build a rank-one tensor series with high computational efficiency and affordable storage cost; b) treating each rank-one tensor as a hidden neuron to achieve an efficient neural network structure; and c) developing an adaptive backward propagation (ABP) learning scheme for efficient model training. Experimental results on six HDI tensors from a real system demonstrate that, compared with state-of-the-art models, the proposed model achieves significant gains in both convergence rate and accuracy. Hence, it is of great significance for challenging HDI tensor analysis.
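The paper's NeuLFT model itself is not reproduced here, but the density-oriented principle it builds on — training a Canonical Polyadic (CP) factorization on only the *observed* entries of an incomplete tensor, so cost scales with the known-entry count rather than the full tensor size — can be sketched as follows. All function names, hyper-parameters, and the plain SGD update are illustrative assumptions, not the paper's actual ABP scheme:

```python
import numpy as np

def cp_factorize_hdi(obs, shape, rank=4, lr=0.05, epochs=200, seed=0):
    """Fit a rank-`rank` CP model to an incomplete 3-way tensor.

    obs: list of (i, j, k, value) tuples covering only the OBSERVED
         entries (density-oriented: per-epoch cost is O(|obs| * rank),
         independent of the full tensor's size).
    Returns the three factor matrices A, B, C; entry (i, j, k) is
    approximated by sum_r A[i, r] * B[j, r] * C[k, r], i.e., a sum of
    `rank` rank-one tensors.
    """
    rng = np.random.default_rng(seed)
    A = 0.1 * rng.standard_normal((shape[0], rank))
    B = 0.1 * rng.standard_normal((shape[1], rank))
    C = 0.1 * rng.standard_normal((shape[2], rank))
    for _ in range(epochs):
        for i, j, k, v in obs:
            pred = np.sum(A[i] * B[j] * C[k])  # sum over rank-one terms
            err = pred - v
            # Gradient of 0.5 * err**2 w.r.t. each factor row.
            gA = err * (B[j] * C[k])
            gB = err * (A[i] * C[k])
            gC = err * (A[i] * B[j])
            A[i] -= lr * gA
            B[j] -= lr * gB
            C[k] -= lr * gC
    return A, B, C
```

NeuLFT additionally makes the decomposition nonlinear (each rank-one tensor acts as a hidden neuron with an activation) and replaces the fixed learning rate with hyper-parameter adaptation; this sketch shows only the shared linear, observed-entries-only skeleton.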
Pages: 6148-6166
Page count: 19