Streaming Graph Neural Networks via Generative Replay

Cited by: 17
Authors
Wang, Junshan [1 ]
Zhu, Wenhao [2 ]
Song, Guojie [2 ]
Wang, Liang [1 ]
Affiliations
[1] Alibaba Group, Beijing, People's Republic of China
[2] Peking University, Key Laboratory of Machine Perception, Ministry of Education, School of AI, Beijing, People's Republic of China
Source
PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022 | 2022
Funding
National Natural Science Foundation of China;
Keywords
graph neural networks; continual learning; streaming networks;
DOI
10.1145/3534678.3539336
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Training Graph Neural Networks (GNNs) incrementally is a particularly urgent problem, because real-world graph data usually arrives in a streaming fashion, and inefficient updating of the models results in out-of-date embeddings, degrading performance in downstream tasks. Traditional incremental learning methods gradually forget old knowledge when learning new patterns, which is known as the catastrophic forgetting problem. Although saving and revisiting historical graph data alleviates the problem, storage limitations in real-world applications restrict the amount of data that can be saved, causing the GNN to forget the remaining knowledge. In this paper, we propose a streaming GNN based on generative replay, which can incrementally learn new patterns while maintaining existing knowledge without accessing historical data. Specifically, our model consists of a main model (the GNN) and an auxiliary generative model. The generative model, based on random walks with restart, learns to generate fake historical samples (i.e., nodes and their neighborhoods), which are replayed together with real data during training to avoid the forgetting problem. Besides, we design an incremental update algorithm for the generative model to maintain the graph distribution and for the GNN to capture the current patterns. Our model is evaluated on multiple streaming datasets. The node classification results show that our model updates efficiently and achieves performance comparable to model retraining. Code is available at https://github.com/Junshan-Wang/SGNN-GR.
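For illustration, a minimal sketch of the training scheme the abstract describes, assuming a PyTorch setting. This is not the authors' released implementation (see the repository URL above); the `generator.sample` interface and the `adj`, `restart_p`, `walk_len`, and `n_replay` names are assumptions introduced here, and the simple random-walk-with-restart sampler stands in for the paper's neighborhood generator.

import random
import torch.nn.functional as F

def random_walk_with_restart(adj, root, restart_p=0.15, walk_len=20):
    # Sample a neighborhood sequence around `root`: with probability
    # `restart_p` the walk jumps back to the root node; otherwise it
    # moves to a uniformly chosen neighbor of the current node.
    walk, cur = [root], root
    for _ in range(walk_len):
        if random.random() < restart_p or not adj[cur]:
            cur = root
        else:
            cur = random.choice(adj[cur])
        walk.append(cur)
    return walk

def incremental_step(gnn, generator, new_x, new_y, optimizer, n_replay=64):
    # One incremental update: newly arrived real samples are trained
    # jointly with fake historical samples drawn from the generative
    # model, so old knowledge is rehearsed without storing any history.
    # `generator.sample` is a hypothetical interface returning generated
    # inputs and pseudo-labels for replay.
    replay_x, replay_y = generator.sample(n_replay)
    optimizer.zero_grad()
    loss = (F.cross_entropy(gnn(new_x), new_y)
            + F.cross_entropy(gnn(replay_x), replay_y))
    loss.backward()
    optimizer.step()
    return loss.item()

The joint loss over the real and replayed batches is what lets the GNN fit current patterns while the generator, updated in the same incremental fashion, preserves the historical graph distribution.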
Pages: 1878-1888
Number of pages: 11