DDHH: A Decentralized Deep Learning Framework for Large-scale Heterogeneous Networks

Cited by: 3
Authors:
Imran, Mubashir [1 ]
Yin, Hongzhi [1 ]
Chen, Tong [1 ]
Huang, Zi [1 ]
Zhang, Xiangliang [2 ]
Zheng, Kai [3 ]
Affiliations:
[1] Univ Queensland, Brisbane, Qld, Australia
[2] King Abdullah Univ Sci & Technol, Jeddah, Saudi Arabia
[3] Univ Elect Sci & Technol China, Chengdu, Peoples R China
Keywords:
DOI:
10.1109/ICDE51399.2021.00196
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Subject Classification Code: 0812
Abstract:
Learning vector representations (i.e., embeddings) of nodes in graph-structured information networks has attracted vast interest from both industry and academia. Most real-world networks exhibit a complex and heterogeneous format, enclosing high-order relationships and rich semantic information among nodes. However, existing heterogeneous network embedding (HNE) frameworks are commonly designed in a centralized fashion, i.e., all data storage and learning take place on a single machine. Hence, those HNE methods show severe performance bottlenecks when handling large-scale networks due to high consumption of memory, storage, and running time. In light of this, to cope with large-scale HNE tasks with strong efficiency and effectiveness guarantees, we propose the Decentralized Deep Heterogeneous Hypergraph (DDHH) embedding framework in this paper. In DDHH, we innovatively formulate a large heterogeneous network as a hypergraph, where each hyperedge connects a set of semantically similar nodes. Our framework then intelligently partitions the heterogeneous network using the identified hyperedges. Each resulting subnetwork is assigned to a distributed worker, which employs deep information maximization to locally learn node embeddings from the partition it receives. We further devise a novel embedding alignment scheme to precisely project the independently learned node embeddings from all subnetworks onto a public vector space, thus allowing for downstream tasks. As shown by our experimental results, DDHH significantly improves the efficiency and accuracy of existing HNE models, and can easily scale up to large-scale heterogeneous networks.
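The alignment step described in the abstract (projecting embeddings learned independently on each partition onto a public vector space) can be illustrated with a small, self-contained sketch. This is not the authors' implementation: it assumes a generic orthogonal-Procrustes alignment on anchor nodes shared between two partitions, and all names (procrustes_align, emb_worker1, emb_worker2, anchors) are illustrative only.

# Minimal sketch (assumption: alignment via orthogonal Procrustes on shared
# anchor nodes; DDHH's actual alignment scheme may differ).
import numpy as np

def procrustes_align(src_anchor, tgt_anchor):
    # Orthogonal Q minimizing ||src_anchor @ Q - tgt_anchor||_F (Schoenemann, 1966):
    # SVD of the cross-covariance gives the optimal rotation.
    u, _, vt = np.linalg.svd(src_anchor.T @ tgt_anchor)
    return u @ vt

rng = np.random.default_rng(0)
d = 16                               # embedding dimension
anchors = np.arange(20)              # node IDs present in both partitions

# Simulate two workers whose embeddings live in arbitrarily rotated spaces.
base = rng.normal(size=(100, d))                               # shared geometry
rot = np.linalg.qr(rng.normal(size=(d, d)))[0]                 # random rotation
emb_worker1 = base + 0.01 * rng.normal(size=base.shape)        # "public" space
emb_worker2 = base @ rot + 0.01 * rng.normal(size=base.shape)  # rotated space

# Align worker 2's embeddings onto worker 1's space using the shared anchors.
Q = procrustes_align(emb_worker2[anchors], emb_worker1[anchors])
emb_worker2_aligned = emb_worker2 @ Q

print("mean abs. error before alignment:", np.abs(emb_worker2 - emb_worker1).mean())
print("mean abs. error after alignment: ", np.abs(emb_worker2_aligned - emb_worker1).mean())

Running the sketch shows the error between the two workers' embeddings dropping to roughly the noise level once the rotation is recovered, which is the essential property any such alignment scheme needs before the partitions' embeddings can be used jointly for downstream tasks.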
Pages: 2033 - 2038
Page count: 6
Related Papers (50 items in total)
  • [1] DeHIN: A Decentralized Framework for Embedding Large-Scale Heterogeneous Information Networks
    Imran, Mubashir
    Yin, Hongzhi
    Chen, Tong
    Huang, Zi
    Zheng, Kai
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (04) : 3645 - 3657
  • [2] Decentralized Embedding Framework for Large-Scale Networks
    Imran, Mubashir
    Yin, Hongzhi
    Chen, Tong
    Shao, Yingxia
    Zhang, Xiangliang
    Zhou, Xiaofang
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2020), PT III, 2020, 12114 : 425 - 441
  • [3] A General Embedding Framework for Heterogeneous Information Learning in Large-Scale Networks
    Huang, Xiao
    Li, Jundong
    Zou, Na
    Hu, Xia
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2018, 12 (06)
  • [4] Efficient Large-scale Deep Learning Framework for Heterogeneous Multi-GPU Cluster
    Kim, Youngrang
    Choi, Hyeonseong
    Lee, Jaehwan
    Kim, Jik-Soo
    Jei, Hyunseung
    Roh, Hongchan
    2019 IEEE 4TH INTERNATIONAL WORKSHOPS ON FOUNDATIONS AND APPLICATIONS OF SELF* SYSTEMS (FAS*W 2019), 2019, : 176 - 181
  • [5] A flexible aggregation framework on large-scale heterogeneous information networks
    Yin, Dan
    Gao, Hong
    JOURNAL OF INFORMATION SCIENCE, 2017, 43 (02) : 186 - 203
  • [6] A large-scale evaluation framework for EEG deep learning architectures
    Heilmeyer, Felix A.
    Schirrmeister, Robin T.
    Fiederer, Lukas D. J.
    Voelker, Martin
    Behncke, Joos
    Ball, Tonio
    2018 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2018, : 1039 - 1045
  • [7] Decentralized Ranking in Large-Scale Overlay Networks
    Montresor, Alberto
    Jelasity, Mark
    Babaoglu, Ozalp
    SASOW 2008: SECOND IEEE INTERNATIONAL CONFERENCE ON SELF-ADAPTIVE AND SELF-ORGANIZING SYSTEMS WORKSHOPS, PROCEEDINGS, 2008, : 208 - +
  • [8] Autonomous and decentralized optimization of large-scale heterogeneous wireless networks by neural network dynamics
    Hasegawa, Mikio
    Tran, Ha Nguyen
    Miyamoto, Goh
    Murata, Yoshitoshi
    Harada, Hiroshi
    Kato, Shuzo
    IEICE TRANSACTIONS ON COMMUNICATIONS, 2008, E91B (01) : 110 - 118
  • [9] Decentralized Federated Learning With Asynchronous Parameter Sharing for Large-Scale IoT Networks
    Xie, Haihui
    Xia, Minghua
    Wu, Peiran
    Wang, Shuai
    Huang, Kaibin
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (21) : 34123 - 34139
  • [10] Pairwise Learning for Name Disambiguation in Large-Scale Heterogeneous Academic Networks
    Sun, Qingyun
    Peng, Hao
    Li, Jianxin
    Wang, Senzhang
    Dong, Xiangyu
    Zhao, Liangxuan
    Yu, Philip S.
    He, Lifang
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 511 - 520