GraRep++: Flexible Learning Graph Representations With Weighted Global Structural Information

Cited by: 1
Authors
Ouyang, Mengcen [1 ,2 ]
Zhang, Yinglong [2 ]
Xia, Xuewen [2 ]
Xu, Xing [2 ]
Affiliations
[1] Minnan Normal Univ, Sch Comp Sci, Zhangzhou 363000, Fujian, Peoples R China
[2] Minnan Normal Univ, Sch Phys & Informat Engn, Zhangzhou 363000, Fujian, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Mathematical models; Tuning; Representation learning; Matrix decomposition; Splicing; Predictive models; Graphical models; Graph representation; matrix factorization; feature learning; dimension reduction;
DOI
10.1109/ACCESS.2023.3313411
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline Classification Code
0812;
Abstract
The key to vertex embedding is learning low-dimensional representations of global graph information, and integrating information from multiple steps is an effective strategy. Existing research shows that the transition probability at each step can capture the relationships between different hops, so the graph's global information can be obtained simultaneously. However, much of the current work simply concatenates the representations of different hops into a global representation. In other words, it leaves unclear the contribution of each k-step (k >= 1) of structural information to the embedding. Motivated by this, we propose a unified framework that explicitly considers the contributions of different steps to the global graph representation. It reconsiders these contributions from two perspectives: (i) we flexibly assign different weights to the loss functions of different steps; (ii) depending on k, we design strategies in which all k-step representations are concatenated in different proportions to form the global representation. On this basis, we integrate global structural information into the learning process more effectively. Our proposed framework achieves competitive performance on vertex classification, link prediction, and visualization tasks across multiple datasets.
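To make the weighted-splicing idea concrete, here is a minimal numpy sketch of one way per-step representations could be weighted and concatenated. It follows the general GraRep recipe (factorize a log-transformed k-step transition matrix via truncated SVD), but the specific weights, log shift, and scaling are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def weighted_global_embedding(adj, K, weights, dim):
    """Illustrative weighted multi-step embedding (GraRep-style sketch).

    adj     : dense adjacency matrix (n x n)
    K       : maximum step (hop) to consider
    weights : one scaling factor per step, len(weights) == K (assumed)
    dim     : embedding dimension per step
    """
    n = adj.shape[0]
    # Row-normalized 1-step transition matrix of a random walk.
    deg = adj.sum(axis=1, keepdims=True)
    A = adj / np.maximum(deg, 1)
    Ak = np.eye(n)
    parts = []
    for k in range(1, K + 1):
        Ak = Ak @ A  # k-step transition probabilities
        # Shifted log transform, clipped at zero (as in GraRep).
        X = np.maximum(np.log(Ak + 1e-10) - np.log(1.0 / n), 0.0)
        # Truncated SVD yields the k-step representation.
        U, S, _ = np.linalg.svd(X)
        Wk = U[:, :dim] * np.sqrt(S[:dim])
        # Scale each step's block before concatenation (weighted splicing).
        parts.append(weights[k - 1] * Wk)
    # Global representation: weighted concatenation of all k-step blocks.
    return np.concatenate(parts, axis=1)
```

For example, decaying weights such as `[1.0, 0.5, 0.25]` would emphasize near-hop structure over distant hops; the framework's point is that these proportions are tunable rather than fixed to uniform concatenation.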
Pages: 98217-98229
Number of pages: 13