Knowledge-Preserving Incremental Social Event Detection via Heterogeneous GNNs

Cited by: 64
Authors
Cao, Yuwei [1 ]
Peng, Hao [2 ]
Wu, Jia [3 ]
Dou, Yingtong [1 ]
Li, Jianxin [2 ]
Yu, Philip S. [1 ]
Affiliations
[1] University of Illinois at Chicago, Chicago, IL, USA
[2] Beihang University, Beijing, China
[3] Macquarie University, Sydney, NSW, Australia
Source
PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021) | 2021
Keywords
Social Event Detection; Graph Neural Networks; Incremental Learning; Contrastive Learning
DOI
10.1145/3442381.3449834
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Social events provide valuable insights into group social behaviors and public concerns and therefore have many applications in fields such as product recommendation and crisis management. The complexity and streaming nature of social messages make it appealing to address social event detection in an incremental learning setting, where acquiring, preserving, and extending knowledge are major concerns. Most existing methods, including those based on incremental clustering and community detection, learn limited amounts of knowledge as they ignore the rich semantics and structural information contained in social data. Moreover, they cannot memorize previously acquired knowledge. In this paper, we propose a novel Knowledge-Preserving Incremental Heterogeneous Graph Neural Network (KPGNN) for incremental social event detection. To acquire more knowledge, KPGNN models complex social messages into unified social graphs to facilitate data utilization and explores the expressive power of GNNs for knowledge extraction. To continuously adapt to the incoming data, KPGNN adopts contrastive loss terms that cope with a changing number of event classes. It also leverages the inductive learning ability of GNNs to efficiently detect events and extends its knowledge from previously unseen data. To deal with large social streams, KPGNN adopts a mini-batch subgraph sampling strategy for scalable training, and periodically removes obsolete data to maintain a dynamic embedding space. KPGNN requires no feature engineering and has few hyperparameters to tune. Extensive experiment results demonstrate the superiority of KPGNN over various baselines.
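The abstract notes that KPGNN "adopts contrastive loss terms that cope with a changing number of event classes." As a minimal sketch of that general idea, and not the authors' implementation, the Python/PyTorch snippet below shows a batch-hard triplet margin loss over message embeddings; the function name, margin default, and toy data are illustrative assumptions. Because the objective compares embeddings within a mini-batch instead of scoring them against a fixed-size classifier head, new event classes appearing in later message blocks require no change to the model.

# Minimal sketch (not the authors' code): a class-count-agnostic
# contrastive objective in the form of a batch-hard triplet margin loss.
import torch
import torch.nn.functional as F

def triplet_contrastive_loss(embeddings: torch.Tensor,
                             labels: torch.Tensor,
                             margin: float = 1.0) -> torch.Tensor:
    """For each anchor, use the farthest same-event embedding as the
    positive and the closest other-event embedding as the negative."""
    dist = torch.cdist(embeddings, embeddings)          # pairwise L2 distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)   # same-event mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)

    pos_dist = dist.masked_fill(~same | eye, float('-inf')).max(dim=1).values
    neg_dist = dist.masked_fill(same, float('inf')).min(dim=1).values

    # Skip anchors whose mini-batch lacks a positive or a negative.
    valid = torch.isfinite(pos_dist) & torch.isfinite(neg_dist)
    if not valid.any():
        return embeddings.new_zeros(())
    return F.relu(pos_dist[valid] - neg_dist[valid] + margin).mean()

# Toy usage: 6 message embeddings from 3 events in one mini-batch.
emb = torch.randn(6, 16, requires_grad=True)
lbl = torch.tensor([0, 0, 1, 1, 2, 2])
loss = triplet_contrastive_loss(emb, lbl)
loss.backward()

The margin value and batch-hard mining strategy here are design choices for the sketch only; the paper defines its own loss terms and sampling scheme.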
Pages: 3383-3395
Number of pages: 13