Node ranking algorithm using Graph Convolutional Networks and mini-batch training

Cited: 0
Authors
Li, Wenjun [1 ]
Li, Ting [2 ]
Nikougoftar, Elaheh [3 ]
Affiliations
[1] Suzhou Vocat Inst Ind Technol, Sch Artificial Intelligence, Suzhou 215000, Jiangsu, Peoples R China
[2] Suzhou Muhezi Technol Co Ltd, Suzhou 215000, Jiangsu, Peoples R China
[3] Taali Inst Higher Educ, Dept Comp & Elect, Qom, Iran
Keywords
Graph Convolutional Networks; Influential nodes; Complex networks; Mini-batch training;
DOI
10.1016/j.chaos.2024.115388
Chinese Library Classification
O1 [Mathematics];
Discipline code
0701; 070101;
Abstract
This paper presents a novel algorithm for ranking nodes in graph-structured data using Graph Convolutional Networks (GCNs) combined with mini-batch training. The proposed method integrates local and global structural information, enabling a comprehensive understanding of node importance within complex networks. By employing a multi-layer GCN architecture with residual connections and dropout regularization, our approach captures intricate graph patterns while mitigating common issues such as vanishing gradients and overfitting. The node importance scores are computed using a Multi-Layer Perceptron (MLP), with the entire model trained using Mean Squared Error (MSE) loss optimized via the Adam algorithm. We demonstrate the scalability and effectiveness of our method through extensive experiments on various benchmark datasets, showcasing its superior performance in node ranking tasks compared to existing approaches.
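The forward pass described in the abstract (multi-layer GCN with residual connections, followed by an MLP that maps node embeddings to scalar importance scores) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the layer sizes, random weights, and the toy path graph are assumptions for demonstration, and training (dropout, MSE loss, Adam, mini-batching) is omitted.

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetric GCN normalization: A_hat = D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A_norm, H, W):
    # One propagation step (ReLU activation) plus a residual connection,
    # which helps mitigate vanishing gradients in deeper stacks
    return np.maximum(A_norm @ H @ W, 0.0) + H

def score_nodes(A, X, weights, mlp_w):
    # Stack GCN layers, then apply a (single-layer) MLP head that maps
    # each node embedding to a scalar importance score
    A_norm = normalized_adjacency(A)
    H = X
    for W in weights:
        H = gcn_layer(A_norm, H, W)
    return (H @ mlp_w).ravel()

# Toy 4-node path graph with random 8-dimensional node features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
weights = [rng.normal(scale=0.1, size=(8, 8)) for _ in range(2)]
mlp_w = rng.normal(scale=0.1, size=(8, 1))

scores = score_nodes(A, X, weights, mlp_w)
ranking = np.argsort(-scores)  # node indices ordered by predicted importance
```

In a trained model the weight matrices would be fitted by minimizing MSE against reference importance scores over mini-batches of nodes; here they are random, so only the shapes and data flow are meaningful.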
Pages: 8
Related papers
50 records in total
  • [1] REVISITING GRAPH CONVOLUTIONAL NETWORKS WITH MINI-BATCH SAMPLING FOR HYPERSPECTRAL IMAGE CLASSIFICATION
    Hong, Danfeng
    Gao, Lianru
    Wu, Xin
    Yao, Jing
    Zhang, Bing
    2021 11TH WORKSHOP ON HYPERSPECTRAL IMAGING AND SIGNAL PROCESSING: EVOLUTION IN REMOTE SENSING (WHISPERS), 2021,
  • [2] Confidence Score based Mini-batch Skipping for CNN Training on Mini-batch Training Environment
    Jo, Joongho
    Park, Jongsun
    2020 17TH INTERNATIONAL SOC DESIGN CONFERENCE (ISOCC 2020), 2020, : 129 - 130
  • [3] Deterministic Mini-batch Sequencing for Training Deep Neural Networks
    Banerjee, Subhankar
    Chakraborty, Shayok
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 6723 - 6731
  • [4] Convergence of the Mini-Batch SIHT Algorithm
    Damadi, Saeed
    Shen, Jinglai
    INTELLIGENT SYSTEMS AND APPLICATIONS, VOL 1, INTELLISYS 2023, 2024, 822 : 223 - 233
  • [5] Efficient Mini-batch Training for Stochastic Optimization
    Li, Mu
    Zhang, Tong
    Chen, Yuqiang
    Smola, Alexander J.
    PROCEEDINGS OF THE 20TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING (KDD'14), 2014, : 661 - 670
  • [6] Batch virtual adversarial training for graph convolutional networks
    Deng, Zhijie
    Dong, Yinpeng
    Zhu, Jun
    AI OPEN, 2023, 4 : 73 - 79
  • [7] ON MINI-BATCH TRAINING WITH VARYING LENGTH TIME SERIES
    Iwana, Brian Kenji
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4483 - 4487
  • [8] Research on Mini-Batch Affinity Propagation Clustering Algorithm
    Xu, Ziqi
    Lu, Yahui
    Jiang, Yu
    2022 IEEE 9TH INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA), 2022, : 86 - 95
  • [9] An Asynchronous Mini-batch Algorithm for Regularized Stochastic Optimization
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    2015 54TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2015, : 1384 - 1389
  • [10] An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2016, 61 (12) : 3740 - 3754