Efficient Training of Graph Neural Networks on Large Graphs

Cited: 0
Authors
Shen, Yanyan [1 ]
Chen, Lei [2 ,3 ]
Fang, Jingzhi [2 ]
Zhang, Xin [2 ]
Gao, Shihong [2 ]
Yin, Hongbo [2 ,3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[2] HKUST, Hong Kong, Peoples R China
[3] HKUST GZ, Guangzhou, Peoples R China
Source
PROCEEDINGS OF THE VLDB ENDOWMENT | 2024 / Vol. 17 / No. 12
Funding
U.S. National Science Foundation;
DOI
10.14778/3685800.3685844
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812 ;
摘要
Graph Neural Networks (GNNs) have gained significant popularity for learning representations of graph-structured data. Mainstream GNNs employ the message passing scheme that iteratively propagates information between connected nodes through edges. However, this scheme incurs high training costs, hindering the applicability of GNNs on large graphs. Recently, the database community has extensively researched effective solutions to facilitate efficient GNN training on massive graphs. In this tutorial, we provide a comprehensive overview of the GNN training process based on the graph data lifecycle, covering the graph preprocessing, batch generation, data transfer, and model training stages. We discuss recent data management efforts aiming at accelerating individual stages or improving the overall training efficiency. Recognizing the distinct training issues associated with static and dynamic graphs, we first focus on efficient GNN training on static graphs, followed by an exploration of training GNNs on dynamic graphs. Finally, we suggest some potential research directions in this area. We believe this tutorial is valuable for researchers and practitioners seeking to understand the bottlenecks of GNN training and the advanced data management techniques that accelerate the training of different GNNs on massive graphs in diverse hardware settings.
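The message passing scheme the abstract refers to can be sketched in a few lines. The following is a minimal, self-contained illustration (not code from the tutorial): each node aggregates the mean of its neighbors' feature vectors and combines it with its own features. The graph, feature values, and the simple averaging combine step are all made up for demonstration; real GNN layers use learned weight matrices and nonlinearities.

```python
def message_passing_step(adj, feats):
    """One propagation round of the message passing scheme:
    each node averages its neighbors' features (the "messages")
    and combines the result with its own features.
    adj:   dict mapping node id -> list of neighbor ids
    feats: dict mapping node id -> feature vector (list of floats)
    """
    new_feats = {}
    for node, neighbors in adj.items():
        dim = len(feats[node])
        if neighbors:
            # Aggregate: mean of neighbor features, per dimension.
            agg = [sum(feats[n][d] for n in neighbors) / len(neighbors)
                   for d in range(dim)]
        else:
            # An isolated node receives no messages; keep its own features.
            agg = feats[node][:]
        # Combine: plain average of self features and aggregated message
        # (a stand-in for the learned update in a real GNN layer).
        new_feats[node] = [(x + a) / 2 for x, a in zip(feats[node], agg)]
    return new_feats

# Tiny undirected triangle graph: edges 0-1, 1-2, 0-2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
out = message_passing_step(adj, feats)
```

Stacking k such rounds lets information reach k-hop neighbors, which is also why mini-batch training on large graphs is costly: each target node's loss depends on its entire k-hop neighborhood, motivating the batch generation and data transfer optimizations the tutorial surveys.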
Pages: 4237-4240
Number of pages: 4
Related papers (50 total)
  • [31] PT-KGNN: A framework for pre-training biomedical knowledge graphs with graph neural networks
    Wang Z.
    Wei Z.
    [J]. Computers in Biology and Medicine, 2024, 178
  • [32] Efficient Scaling of Dynamic Graph Neural Networks
    Chakaravarthy, Venkatesan T.
    Pandian, Shivmaran S.
    Raje, Saurabh
    Sabharwal, Yogish
    Suzumura, Toyotaro
    Ubaru, Shashanka
    [J]. SC21: INTERNATIONAL CONFERENCE FOR HIGH PERFORMANCE COMPUTING, NETWORKING, STORAGE AND ANALYSIS, 2021,
  • [33] Distributed Hybrid CPU and GPU training for Graph Neural Networks on Billion-Scale Heterogeneous Graphs
    Zheng, Da
    Song, Xiang
    Yang, Chengru
    LaSalle, Dominique
    Karypis, George
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 4582 - 4591
  • [34] TinyGNN: Learning Efficient Graph Neural Networks
    Yan, Bencheng
    Wang, Chaokun
    Guo, Gaoyang
    Lou, Yunkai
    [J]. KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1848 - 1856
  • [35] Efficient training of backpropagation neural networks
    Otair, Mohammed A.
    Salameh, Walid A.
    [J]. NEURAL NETWORK WORLD, 2006, 16 (04) : 291 - 311
  • [36] An efficient segmented quantization for graph neural networks
    Yue Dai
    Xulong Tang
    Youtao Zhang
    [J]. CCF Transactions on High Performance Computing, 2022, 4 : 461 - 473
  • [37] Fast and Efficient Training of Neural Networks
    Yu, Hao
    Wilamowski
    [J]. 3RD INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTION, 2010, : 175 - 181
  • [38] Efficient and Reliable Training of Neural Networks
    Yu, Hao
    Wilamowski, Bogdan M.
    [J]. HSI: 2009 2ND CONFERENCE ON HUMAN SYSTEM INTERACTIONS, 2009, : 106 - 112
  • [39] Graph neural networks at the Large Hadron Collider
    Gage DeZoort
    Peter W. Battaglia
    Catherine Biscarat
    Jean-Roch Vlimant
    [J]. Nature Reviews Physics, 2023, 5 : 281 - 303