Decomposing Temporal High-Order Interactions via Latent ODEs

Cited: 0
Authors
Li, Shibo [1]
Kirby, Robert M. [1,2]
Zhe, Shandian [1]
Affiliations
[1] Univ Utah, Sch Comp, Salt Lake City, UT 84112 USA
[2] Univ Utah, Sci Comp & Imaging SCI Inst, Salt Lake City, UT 84112 USA
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162 | 2022
Keywords
DOI: not available
CLC Number: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
High-order interactions between multiple objects are common in real-world applications. Although tensor decomposition is a popular framework for analyzing and predicting high-order interactions, most methods cannot fully exploit the valuable timestamp information in data: existing methods either discard the timestamps, convert them into discrete steps, or use over-simplistic decomposition models. As a result, these methods may fail to capture complex, fine-grained temporal dynamics or to make accurate predictions about long-term interaction results. To overcome these limitations, we propose a novel Temporal High-order Interaction decompoSition model based on Ordinary Differential Equations (THIS-ODE). We model the time-varying interaction result with a latent ODE. To capture the complex temporal dynamics, we use a neural network (NN) to learn the time derivative of the ODE state. We use the representations of the interaction objects to model the initial value of the ODE and to constitute part of the NN input for computing the state. In this way, the temporal relationships of the participating objects can be estimated and encoded into their representations. For tractable and scalable inference, we use forward sensitivity analysis to efficiently compute the gradient of the ODE state, based on which we use an integral transform to develop a stochastic mini-batch learning algorithm. We demonstrate the advantage of our approach in simulation and in four real-world applications.
Pages: 16
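To make the architecture described in the abstract concrete, the sketch below gives one plausible reading of it in PyTorch: object embeddings determine the ODE's initial state and feed the NN time derivative, and the state is integrated up to the query timestamp to predict the interaction value. This is a minimal sketch built from the abstract alone; EMB_DIM, STATE_DIM, the MLP shape, the linear readout, and the fixed-step RK4 integrator are all hypothetical, and it backpropagates through the unrolled solver rather than using the paper's forward sensitivity analysis and integral-transform learning algorithm.

import torch
import torch.nn as nn

EMB_DIM, STATE_DIM = 8, 4  # assumed sizes, not from the paper

class InteractionODE(nn.Module):
    """Latent ODE whose initial state and dynamics are conditioned on object embeddings."""
    def __init__(self, n_objs):
        super().__init__()
        # One embedding table per tensor mode (e.g., user, item, context).
        self.emb = nn.ModuleList([nn.Embedding(n, EMB_DIM) for n in n_objs])
        # Maps the concatenated embeddings to the ODE's initial state x(0).
        self.init_net = nn.Linear(len(n_objs) * EMB_DIM, STATE_DIM)
        # NN time derivative: f(x, t, embeddings) -> dx/dt.
        self.f_net = nn.Sequential(
            nn.Linear(STATE_DIM + 1 + len(n_objs) * EMB_DIM, 64),
            nn.Tanh(),
            nn.Linear(64, STATE_DIM),
        )
        self.readout = nn.Linear(STATE_DIM, 1)  # state -> scalar interaction result

    def forward(self, idx, t, n_steps=20):
        # idx: (batch, n_modes) object indices; t: (batch,) query timestamps.
        e = torch.cat([emb(idx[:, m]) for m, emb in enumerate(self.emb)], dim=-1)
        x = self.init_net(e)                    # x(0) computed from the embeddings
        h = (t / n_steps).unsqueeze(-1)         # per-example RK4 step size
        s = torch.zeros_like(h)                 # current integration time per example

        def f(x, s):
            return self.f_net(torch.cat([x, s, e], dim=-1))

        for _ in range(n_steps):                # fixed-step RK4 integration of dx/dt
            k1 = f(x, s)
            k2 = f(x + 0.5 * h * k1, s + 0.5 * h)
            k3 = f(x + 0.5 * h * k2, s + 0.5 * h)
            k4 = f(x + h * k3, s + h)
            x = x + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
            s = s + h
        return self.readout(x).squeeze(-1)      # predicted interaction value at time t

# Usage: predict the values of interactions (i, j, k) observed at times t.
model = InteractionODE(n_objs=[10, 10, 10])
idx = torch.tensor([[1, 2, 3], [4, 5, 6]])
t = torch.tensor([0.5, 1.2])
y_pred = model(idx, t)

Under this reading, training would fit the embeddings, dynamics network, and readout jointly by minimizing, for example, squared error against observed (indices, timestamp, value) entries in mini-batches.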