50 entries in total
- [23] SGKD: A Scalable and Effective Knowledge Distillation Framework for Graph Representation Learning. 2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW, 2022: 666-673
- [25] Knowledge Distillation with Graph Neural Networks for Epileptic Seizure Detection. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2023, PT VI, 2023, 14174: 547-563
- [26] GRAND+: Scalable Graph Random Neural Networks. PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022: 3248-3258
- [27] When Pansharpening Meets Graph Convolution Network and Knowledge Distillation. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
- [28] Revisiting Graph-based Social Recommendation: A Distillation Enhanced Social Graph Network. PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022: 2830-2838
- [29] Narrow the Input Mismatch in Deep Graph Neural Network Distillation. PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023: 3581-3592
- [30] GSGNet-S*: Graph Semantic Guidance Network via Knowledge Distillation for Optical Remote Sensing Image Scene Analysis. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61: 1-12