SaaN 2L-GRL: Two-Level Graph Representation Learning Empowered With Subgraph-as-a-Node

Cited: 0
Authors
Park, Jeong-Ha [1 ]
Lim, Bo-Young [1 ]
Lee, Kisung [2 ]
Kwon, Hyuk-Yoon [1 ]
Affiliations
[1] Seoul Natl Univ Sci & Technol, Grad Sch Data Sci, Seoul 01811, South Korea
[2] Louisiana State Univ, Div Comp Sci & Engn, Baton Rouge, LA 70803 USA
Funding
National Research Foundation of Singapore;
Keywords
Graph partitioning; graph representation learning; learning efficiency; representation accuracy; two-level architecture; subgraph-as-a-node; NEURAL-NETWORKS;
DOI
10.1109/TKDE.2024.3421933
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this study, we propose a novel graph representation learning (GRL) model, called Two-Level GRL with Subgraph-as-a-Node (SaaN 2L-GRL in short), that partitions input graphs into smaller subgraphs for effective and scalable GRL in two levels: 1) local GRL and 2) global GRL. To realize the two-level GRL in an efficient manner, we propose an abstracted graph, called Subgraph-as-a-Node Graph (SaaN in short), to effectively maintain the high-level graph topology while significantly reducing the size of the graph. By applying the SaaN graph to both local and global GRL, SaaN 2L-GRL can effectively preserve the overall structure of the entire graph while precisely representing the nodes within each subgraph. Through time complexity analysis, we confirm that SaaN 2L-GRL significantly reduces the learning time of existing GRL models by using the SaaN graph for global GRL, instead of using the original graph, and processing local GRL on subgraphs in parallel. Our extensive experiments show that SaaN 2L-GRL outperforms existing GRL models in both accuracy and efficiency. In addition, we show the effectiveness of SaaN 2L-GRL using diverse kinds of graph partitioning methods, including five community detection algorithms and representative edge- and vertex-cut algorithms.
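The core abstraction described above — collapsing each partitioned subgraph into a single node so that global GRL runs on a much smaller graph, while intra-subgraph edges are retained for parallel local GRL — can be sketched in plain Python. This is a minimal illustrative sketch; the function name, data layout, and return values are assumptions for exposition, not the authors' implementation:

```python
from collections import defaultdict

def build_saan_graph(edges, partition):
    """Collapse each subgraph into a single node (Subgraph-as-a-Node).

    edges     : iterable of (u, v) pairs from the original graph
    partition : dict mapping each node to its subgraph id
    Returns (saan_edges, intra_edges):
      saan_edges  - set of edges between subgraph-nodes (input to global GRL)
      intra_edges - per-subgraph edge lists (inputs to parallel local GRL)
    """
    saan_edges = set()
    intra_edges = defaultdict(list)
    for u, v in edges:
        su, sv = partition[u], partition[v]
        if su == sv:
            # edge stays inside one subgraph: used for local GRL
            intra_edges[su].append((u, v))
        else:
            # inter-subgraph edge becomes an edge between subgraph-nodes
            saan_edges.add((min(su, sv), max(su, sv)))
    return saan_edges, dict(intra_edges)

# Toy graph: subgraph 0 = {a, b, c}, subgraph 1 = {d, e}
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")]
partition = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1}
saan_edges, intra_edges = build_saan_graph(edges, partition)
# saan_edges == {(0, 1)}: the abstracted graph has 2 nodes and 1 edge,
# regardless of how large the original subgraphs are.
```

The efficiency claim in the abstract follows directly: global GRL sees only the subgraph-nodes and their inter-subgraph edges, so its cost scales with the number of partitions rather than the number of original nodes, and each `intra_edges[i]` can be processed independently in parallel.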
Pages: 9205 - 9219
Page count: 15