Information cascades prediction with attention neural network

Cited by: 21
Authors
Liu, Yun [1 ]
Bao, Zemin [1 ,2 ]
Zhang, Zhenjiang [1 ]
Tang, Di [3 ]
Xiong, Fei [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Key Lab Commun & Informat Syst, Beijing Municipal Commiss Educ, Beijing 100044, Peoples R China
[2] Coordinat Ctr China, Natl Comp Network Emergency Response Tech Team, Beijing 100029, Peoples R China
[3] Minist Publ Secur, Res Inst 3, Shanghai 200031, Peoples R China
Funding
US National Science Foundation;
Keywords
Information diffusion; Deep learning; Attention network; Cascade prediction; POPULARITY; MODEL;
DOI
10.1186/s13673-020-00218-w
Chinese Library Classification
TP [Automation & Computer Technology];
Subject Classification Code
0812;
Abstract
Cascade prediction helps us uncover the basic mechanisms that govern collective human behavior in networks, and it is also important in many other applications, such as viral marketing, online advertising, and recommender systems. However, making such predictions is not trivial because myriad factors influence a user's decision to reshare content. This paper presents a novel method for predicting the increment size of an information cascade based on an end-to-end neural network. Learning the representation of a cascade in an end-to-end manner circumvents the difficulties inherent in designing hand-crafted features. An attention mechanism, consisting of an intra-attention module and an inter-gate module, was designed to extract and fuse the temporal and structural information learned from the observed period of the cascade. Experiments were performed on two real-world scenarios, i.e., predicting the size of retweet cascades on Twitter and predicting the citation counts of papers in AMiner. Extensive results demonstrate that our method outperforms state-of-the-art cascade prediction methods, including both feature-based and generative approaches.
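The abstract's fusion scheme can be illustrated with a minimal sketch. This is not the authors' code: the query vector, gate parameterization, and dimensions below are hypothetical stand-ins, showing only the general pattern of an intra-attention summary per modality followed by a sigmoid inter-gate that blends the temporal and structural summaries.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def intra_attention(seq):
    """Summarize one modality's sequence (e.g. retweet times or cascade-graph
    node embeddings) into a single vector via attention weights.
    The mean-based query is a stand-in for a learned query vector."""
    query = seq.mean(axis=0)
    scores = softmax(seq @ query)   # one weight per step/node
    return scores @ seq             # weighted sum -> summary vector

def inter_gate(temporal_vec, structural_vec, W):
    """Fuse the two summaries with a sigmoid gate, letting the model lean on
    whichever signal is more informative for a given cascade."""
    g = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([temporal_vec, structural_vec]))))
    return g * temporal_vec + (1.0 - g) * structural_vec

rng = np.random.default_rng(0)
temporal = intra_attention(rng.normal(size=(5, 4)))    # 5 time steps, dim 4
structural = hidden = intra_attention(rng.normal(size=(7, 4)))  # 7 nodes, dim 4
W = rng.normal(size=(4, 8))                            # hypothetical gate weights
fused = inter_gate(temporal, structural, W)
print(fused.shape)  # (4,)
```

Because the gate is elementwise in (0, 1), each fused coordinate is a convex combination of the corresponding temporal and structural coordinates, so neither modality is discarded outright.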
Pages: 16
Related Papers
50 records
  • [31] Missing well logs prediction using deep learning integrated neural network with the self-attention mechanism
    Wang, Jun
    Cao, Junxing
    Fu, Jingcheng
    Xu, Hanqing
    ENERGY, 2022, 261
  • [32] Multi-scale attention in attention neural network for single image deblurring
    Lee, Ho Sub
    Cho, Sung In
    DISPLAYS, 2024, 85
  • [33] LA-RCNN: Luong attention-recurrent-convolutional neural network for EV charging load prediction
    Mekkaoui, Djamel Eddine
    Midoun, Mohamed Amine
    Shen, Yanming
    APPLIED INTELLIGENCE, 2024, 54 (05): 4352-4369
  • [34] Attention recurrent neural network with earthworm optimization on gross domestic product prediction using main economic activities
    Halawani, Hanan T.
    Mohamed, Halima Younis A.
    Alzakari, Sarah A.
    Alruwaitee, Khalil A.
    Alharethi, Thikraa M.
    Yagoub, Rahntalla Y.
    Osman, Alnour
    THERMAL SCIENCE, 2024, 28 (6B): 5087-5095
  • [35] Bayesian Inference of Network Structure From Information Cascades
    Gray, Caitlin
    Mitchell, Lewis
    Roughan, Matthew
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2020, 6: 371-381
  • [36] Spatio-Temporal Attention based Recurrent Neural Network for Next Location Prediction
    Altaf, Basmah
    Yu, Lu
    Zhang, Xiangliang
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018: 937-942
  • [37] A Spatiotemporal Graph Neural Network with Graph Adaptive and Attention Mechanisms for Traffic Flow Prediction
    Huo, Yanqiang
    Zhang, Han
    Tian, Yuan
    Wang, Zijian
    Wu, Jianqing
    Yao, Xinpeng
    ELECTRONICS, 2024, 13 (01)
  • [38] Recurrent Neural Network-Based Hourly Prediction of Photovoltaic Power Output Using Meteorological Information
    Lee, Donghun
    Kim, Kwanho
    ENERGIES, 2019, 12 (02)
  • [39] Traffic Flow Prediction Based on Spatial-Temporal Attention Convolutional Neural Network
    Xia Y.
    Liu M.
    Xinan Jiaotong Daxue Xuebao/Journal of Southwest Jiaotong University, 2023, 58 (02): 340-347
  • [40] Compact Convolutional Neural Network with Multi-Headed Attention Mechanism for Seizure Prediction
    Ding, Xin
    Nie, Weiwei
    Liu, Xinyu
    Wang, Xiuying
    Yuan, Qi
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2023, 33 (03)