Scalable Graph Neural Networks with Deep Graph Library

Cited: 14
Authors
Zheng, Da [1 ]
Wang, Minjie [2 ]
Gan, Quan [2 ]
Song, Xiang [2 ]
Zhang, Zheng [2 ]
Karypis, George [1]
Affiliations
[1] AWS AI, Palo Alto, CA 94303 USA
[2] AWS Shanghai AI Lab, Shanghai, People's Republic of China
Source
WSDM '21: PROCEEDINGS OF THE 14TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING | 2021
Keywords
graph neural networks; Deep Graph Library; scalability;
DOI
10.1145/3437963.3441663
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning from graph and relational data plays a major role in many applications, including social network analysis, marketing, e-commerce, information retrieval, knowledge modeling, medical and biological sciences, and engineering. Recently, Graph Neural Networks (GNNs) have emerged as a promising new learning framework capable of bringing the power of deep representation learning to graph and relational data. This ever-growing body of research has shown that GNNs achieve state-of-the-art performance for problems such as link prediction, fraud detection, target-ligand binding activity prediction, knowledge-graph completion, and product recommendation. In practice, many real-world graphs are very large, so scalable solutions for training GNNs on large graphs efficiently are urgently needed. The objective of this tutorial is twofold. First, it will provide an overview of the theory behind GNNs, discuss the types of problems that GNNs are well suited for, and introduce some of the most widely used GNN model architectures along with the problems and applications they are designed to solve. Second, it will introduce the Deep Graph Library (DGL), a scalable GNN framework that simplifies the development of efficient GNN-based training and inference programs at large scale. To make things concrete, the tutorial will cover state-of-the-art training methods for scaling GNNs to large graphs and provide hands-on sessions showing how to use DGL to perform scalable training in different settings (multi-GPU training and distributed training). The hands-on part will start with basic graph applications (e.g., node classification and link prediction) to set up the context, and then move on to training GNNs on large graphs. It will demonstrate how to apply the techniques in DGL to train GNNs for real-world applications.
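The core idea the tutorial builds on, message passing, can be illustrated with a minimal sketch. The code below is not DGL's actual API; it is a plain-NumPy illustration of one GCN-style layer, where each node averages its neighbors' features (including itself, via self-loops) and then applies a learned linear transform. All names here (`gcn_layer`, the toy graph and weights) are illustrative assumptions, not part of DGL.

```python
# Minimal message-passing sketch (illustrative only, not DGL's API):
# one GCN-style layer aggregates neighbor features, then applies a
# linear map followed by a ReLU nonlinearity.
import numpy as np

def gcn_layer(adj, feats, weight):
    """adj: (N, N) adjacency with self-loops; feats: (N, F_in); weight: (F_in, F_out)."""
    deg = adj.sum(axis=1, keepdims=True)   # node degrees (including self-loop)
    agg = (adj @ feats) / deg              # mean-aggregate each node's neighborhood
    return np.maximum(agg @ weight, 0.0)   # linear transform + ReLU

# Toy graph: 3 nodes in a path 0-1-2, with self-loops added on the diagonal.
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]], dtype=float)
feats = np.eye(3)          # one-hot input features, one per node
weight = np.ones((3, 2))   # toy weight matrix (F_in=3, F_out=2)
out = gcn_layer(adj, feats, weight)   # shape (3, 2)
```

Frameworks like DGL implement the same aggregate-then-transform pattern with sparse kernels and neighbor sampling so that it scales to graphs that do not fit a dense adjacency matrix.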
Pages: 1141-1142
Page count: 2