Learning Dynamic Context Graphs for Predicting Social Events

Cited by: 57
Authors
Deng, Songgaojun [1 ]
Rangwala, Huzefa [2 ]
Ning, Yue [1 ]
Affiliations
[1] Stevens Inst Technol, Hoboken, NJ 07030 USA
[2] George Mason Univ, Fairfax, VA 22030 USA
Source
KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING | 2019
Keywords
Event Prediction; Dynamic Graph Convolutional Network; Temporal Encoding;
DOI
10.1145/3292500.3330919
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline Classification Code
0812 ;
Abstract
Event forecasting that models contextual information is an important task for applications such as automated analysis generation and resource allocation. Contextual information captured for an event of interest can help human analysts understand the factors associated with that event. However, capturing contextual information within event forecasting is challenging for several reasons: (i) uncertainty in the structure and formulation of context, (ii) high-dimensional features, and (iii) adaptation of features over time. Recently, graph representations have demonstrated success in applications such as traffic forecasting, social influence prediction, and visual question answering. In this paper, we study graph representations for modeling social events and identifying dynamic properties of event contexts as social indicators. Inspired by graph neural networks, we propose a novel graph convolutional network for predicting future events (e.g., civil unrest movements). We extract and learn graph representations from historical/prior event documents. Using the hidden word-graph features, our proposed model predicts the occurrence of future events and identifies sequences of dynamic graphs as event context. Experimental results on multiple real-world data sets show that the proposed method is competitive with various state-of-the-art methods for social event prediction.
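The abstract describes graph convolutions over word graphs extracted from event documents, followed by a prediction of whether an event will occur. A minimal sketch of that idea is below, using the standard GCN propagation rule H' = ReLU(D^(-1/2)(A+I)D^(-1/2)HW) on a toy word co-occurrence graph with a logistic readout. All names and sizes (A, H, W, n_words) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch: one graph-convolution step over a word
# co-occurrence graph, followed by a logistic event-probability readout.
rng = np.random.default_rng(0)

n_words, d_in, d_out = 5, 8, 4
A = (rng.random((n_words, n_words)) > 0.6).astype(float)
A = np.maximum(A, A.T)                 # symmetric co-occurrence adjacency
A_hat = A + np.eye(n_words)            # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))

H = rng.standard_normal((n_words, d_in))   # word features (e.g., embeddings)
W = rng.standard_normal((d_in, d_out))     # learnable layer weights

# Standard GCN propagation rule: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)
H_next = np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Pool node states and score event occurrence with a logistic readout
w_out = rng.standard_normal(d_out)
p_event = 1.0 / (1.0 + np.exp(-(H_next.mean(axis=0) @ w_out)))
print(H_next.shape, float(p_event))
```

In the paper's setting, one such graph would be built per time step from that day's event documents, and the sequence of graph representations encoded temporally; this sketch shows only a single static layer.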
Pages: 1007-1016 (10 pages)