Distributed neural tensor completion for network monitoring data recovery

Cited by: 17
Authors
Liu, Chunsheng [1 ]
Xie, Kun [2 ]
Wu, Tao [1 ]
Ma, Chunlai [1 ]
Ma, Tao [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Elect Engn, Hefei 230037, Anhui, Peoples R China
[2] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha 410006, Hunan, Peoples R China
Keywords
Network data recovery; Tensor completion; Distributed neural tensor completion; Convolutional neural network;
DOI
10.1016/j.ins.2024.120259
Chinese Library Classification: TP [Automation Technology; Computer Technology]
Discipline Classification Code: 0812
Abstract
Network monitoring data are usually incomplete, and accurate, fast recovery of the missing data is of great significance for practical applications. Tensor-based nonlinear methods have attracted recent attention for their capability of capturing complex interactions among data, enabling more accurate recovery. However, the training process of existing methods is often time-consuming due to massive data volumes and unreasonable network resource allocation. Motivated by this, we propose a distributed neural tensor completion method, named D-NORM, which optimizes recovery accuracy and recovery time simultaneously. Specifically, D-NORM adopts two schemes to solve the resulting optimization problem. First, we design a parameter-efficient multi-layer architecture with a convolutional neural network to learn nonlinear correlations among data. Second, we reformulate the initial model as an equivalent set-function optimization problem under a matroid base constraint. After constructing an approximate supermodular function with a provable upper bound to substitute for the objective set function, we propose an approximation algorithm based on a two-stage search procedure, with a theoretical performance guarantee, to rationally allocate computing resources and efficiently recover missing data. Extensive experiments conducted on real-world datasets validate the superiority of D-NORM in both efficiency and effectiveness.
Pages: 17