Survey on Large-scale Graph Neural Network Systems

Cited by: 0
Authors
Zhao G. [1 ]
Wang Q.-G. [1 ]
Yao F. [1 ]
Zhang Y.-F. [1 ]
Yu G. [1 ]
Affiliations
[1] School of Computer Science and Engineering, Northeastern University, Shenyang
Source
Ruan Jian Xue Bao/Journal of Software | 2022, Vol. 33, No. 1
Funding
National Natural Science Foundation of China
Keywords
Back propagation; Deep learning; Distributed system; Graph neural network (GNN); Large-scale graph data;
DOI
10.13328/j.cnki.jos.006311
Abstract
Graph neural networks (GNNs) process graph-structured data with deep learning techniques. They combine graph propagation operations with deep learning algorithms to fully exploit both graph structure information and vertex features during learning. GNNs have been widely applied to tasks such as node classification, graph classification, and link prediction, showing promising effectiveness and interpretability. However, existing deep learning frameworks (such as TensorFlow and PyTorch) provide neither efficient storage support nor message passing support for GNN training, which limits their use on large-scale graph data. A number of large-scale GNN systems have therefore been designed around the data characteristics of graph structures and the computational characteristics of GNNs. This study first briefly reviews GNNs and summarizes the challenges faced in designing GNN systems. It then reviews existing work on GNN training systems, analyzing these systems from multiple aspects including system architecture, programming model, message passing optimization, graph partitioning strategies, and communication optimization. Finally, several open-source GNN systems are chosen for experimental evaluation and compared in terms of accuracy, efficiency, and scalability. © Copyright 2022, Institute of Software, the Chinese Academy of Sciences. All rights reserved.
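The graph propagation operation the abstract refers to can be illustrated with a minimal, dense-matrix sketch of one GCN-style layer, H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W). This is a hypothetical illustration (the function name `gcn_layer` and the toy graph are not from the paper); the surveyed systems implement the same propagation with sparse, distributed message passing rather than dense matrices.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style propagation step: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    A: (n, n) adjacency matrix, H: (n, f) vertex features, W: (f, d) weights.
    A minimal dense sketch for illustration only; large-scale GNN systems
    realize this as sparse message passing over a partitioned graph.
    """
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # vertex degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy 3-node path graph, 2-dim one-hot features, identity weights.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3, 2)
W = np.eye(2)
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Each vertex's output row aggregates degree-normalized features from its neighbors and itself, which is exactly the per-layer message passing step whose storage and communication costs the surveyed systems optimize.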
Pages: 150-170 (20 pages)