Pruning graph neural networks by evaluating edge properties

Cited by: 7
Authors
Wang, Li [1 ,2 ]
Huang, Wei [2 ]
Zhang, Miao [3 ]
Pan, Shirui [4 ]
Chang, Xiaojun [2 ]
Su, Steven Weidong [1 ,2 ]
Affiliations
[1] Shandong First Med Univ & Shandong Acad Med Sci, Coll Artif Intelligence & Big Data Med Sci, 6699 Qingdao Rd, Jinan 250000, Shandong, Peoples R China
[2] Univ Technol Sydney, Fac Engn & IT, 81 Broadway, Ultimo, NSW 2007, Australia
[3] Aalborg Univ, Comp Sci Dept, Fredrik Bajers Vej 7K, DK-9220 Aalborg, Denmark
[4] Griffith Univ, Sch Informat & Commun Technol, Parklands Dr, Southport, Qld 4222, Australia
Keywords
Graph neural networks; Network pruning; Model compression
DOI
10.1016/j.knosys.2022.109847
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The emergence of larger and deeper graph neural networks (GNNs) makes their training and inference increasingly expensive. Existing GNN pruning methods prune the graph adjacency matrix and the model weights simultaneously on a pretrained network by directly applying the lottery-ticket hypothesis, but the benefits of such methods come mainly from weight pruning, and their saliency metrics struggle to outperform random pruning when only the graph adjacency matrix is pruned. This motivates us to score graph edges and network weights by different standards during GNN pruning. Rather than measuring the importance of graph edges with saliency metrics, we mathematically formulate the performance of GNNs with respect to the properties of their edges, showing how a performance drop can be avoided by pruning negative edges and nonbridges. This leads to a simple but effective two-step method for GNN pruning: saliency metrics are used to prune the network weights, while the graph is sparsified in a way that preserves the loss. Experimental results show the effectiveness and efficiency of the proposed method on both small-scale graph datasets (Cora, Citeseer, and PubMed) and a large-scale dataset (Ogbn-ArXiv), where our method saves up to 98% of floating-point operations (FLOPs) on the small graphs and 94% of FLOPs on the large one, with no significant drop in accuracy. (c) 2022 Elsevier B.V. All rights reserved.
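The two-step procedure described above lends itself to a short illustration. The following Python sketch (using networkx and numpy) is a minimal reading of the idea under stated assumptions, not the paper's algorithm: the edge_scores dictionary, the keep_ratio and sparsity parameters, and the use of weight magnitude as the saliency metric are all illustrative. Step 1 drops negatively scored edges and then low-scored non-bridge edges (edges whose removal does not disconnect their component); step 2 prunes the network weights by magnitude.

# Illustrative sketch of the two-step pruning idea from the abstract.
# NOT the paper's exact algorithm: edge_scores, keep_ratio, and the
# magnitude-based weight criterion are assumptions made for this demo.
import networkx as nx
import numpy as np

def _score(e, edge_scores):
    # An undirected edge may be reported as (u, v) or (v, u).
    return edge_scores.get(e, edge_scores.get((e[1], e[0]), 0.0))

def prune_graph(G, edge_scores, keep_ratio=0.5):
    """Step 1: sparsify the graph.

    First drop all negatively scored edges, then greedily remove the
    lowest-scored non-bridge edges (removing a bridge would disconnect
    a component) until at most keep_ratio of the original edges remain.
    """
    H = G.copy()
    H.remove_edges_from([e for e in list(H.edges())
                         if _score(e, edge_scores) < 0.0])
    target = int(keep_ratio * G.number_of_edges())
    while H.number_of_edges() > target:
        bridges = set(nx.bridges(H))  # bridges are kept to preserve connectivity
        candidates = [e for e in H.edges()
                      if e not in bridges and (e[1], e[0]) not in bridges]
        if not candidates:            # only bridges left: stop early
            break
        H.remove_edge(*min(candidates, key=lambda e: _score(e, edge_scores)))
    return H

def prune_weights(W, sparsity=0.9):
    """Step 2: magnitude pruning, one common saliency metric for weights."""
    threshold = np.quantile(np.abs(W), sparsity)
    return np.where(np.abs(W) >= threshold, W, 0.0)

if __name__ == "__main__":
    G = nx.karate_club_graph()
    rng = np.random.default_rng(0)
    scores = {e: rng.normal() for e in G.edges()}  # stand-in edge scores
    H = prune_graph(G, scores, keep_ratio=0.5)
    print(G.number_of_edges(), "->", H.number_of_edges())
    W = rng.normal(size=(16, 16))
    print("nonzero weights:", np.count_nonzero(prune_weights(W)))

On a toy graph such as Zachary's karate club, the sketch halves the edge count without deleting any bridge, which mirrors the abstract's point that nonbridges can be removed while the graph's connectivity, and hence its message-passing structure, is preserved.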
Pages: 10
Related Papers (50 in total)
  • [1] Pruning graph neural networks by evaluating edge properties
    Wang, Li
    Huang, Wei
    Zhang, Miao
    Pan, Shirui
    Chang, Xiaojun
    Su, Steven Weidong
    Knowledge-Based Systems, 2022, 256
  • [2] Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks
    Liu, Chuang
    Ma, Xueqi
    Zhan, Yibing
    Ding, Liang
    Tao, Dapeng
    Du, Bo
    Hu, Wenbin
    Mandic, Danilo P.
    IEEE Transactions on Neural Networks and Learning Systems, 2024, 35 (10): 14903-14917
  • [3] Dynamic hard pruning of Neural Networks at the edge of the internet
    Valerio, Lorenzo
    Nardini, Franco Maria
    Passarella, Andrea
    Perego, Raffaele
    Journal of Network and Computer Applications, 2022, 200
  • [4] Evaluating explainability for graph neural networks
    Agarwal, Chirag
    Queen, Owen
    Lakkaraju, Himabindu
    Zitnik, Marinka
    Scientific Data, 2023, 10 (01)
  • [5] Evaluating Extended Pruning on Object Detection Neural Networks
    O'Keeffe, Simon
    Villing, Rudi
    2018 29th Irish Signals and Systems Conference (ISSC), 2018
  • [6] Training Sparse Graph Neural Networks via Pruning and Sprouting
    Ma, Xueqi
    Ma, Xingjun
    Erfani, Sarah
    Bailey, James
    Proceedings of the 2024 SIAM International Conference on Data Mining (SDM), 2024: 136-144
  • [7] Exploiting Edge Features for Graph Neural Networks
    Gong, Liyu
    Cheng, Qiang
    2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019), 2019: 9203-9211
  • [8] HodgeNet: Graph Neural Networks for Edge Data
    Roddenberry, T. Mitchell
    Segarra, Santiago
    Conference Record of the 2019 Fifty-Third Asilomar Conference on Signals, Systems & Computers, 2019: 220-224
  • [9] EdgeNets: Edge Varying Graph Neural Networks
    Isufi, Elvin
    Gama, Fernando
    Ribeiro, Alejandro
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44 (11): 7457-7473