Knowledge Distillation with Graph Neural Networks for Epileptic Seizure Detection

Cited by: 4
Authors
Zheng, Qinyue [1 ]
Venkitaraman, Arun [1 ]
Petravic, Simona [2 ]
Frossard, Pascal [1 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Signal Proc Lab LTS4, Lausanne, Switzerland
[2] Embark Studios, Stockholm, Sweden
Source
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2023, PT VI | 2023 / Vol. 14174
Keywords
Personalized seizure detection; Graph neural networks; Knowledge distillation
DOI
10.1007/978-3-031-43427-3_33
Chinese Library Classification
TP18 (Artificial Intelligence Theory)
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Wearable devices for seizure monitoring could significantly improve the quality of life of epileptic patients. However, existing solutions, which mostly rely on the full electrode set of electroencephalogram (EEG) measurements, can be inconvenient for everyday use. In this paper, we propose a novel knowledge distillation approach that transfers the knowledge from a sophisticated seizure detector (called the teacher), trained on data from the full set of electrodes, to new detectors (called the students) that provide lightweight implementations and significantly reduce the number of electrodes needed for recording the EEG. We consider the case where both the teacher and the student seizure detectors are graph neural networks (GNNs), since these architectures actively use the connectivity information. We consider two cases: (a) a single student is learnt for all the patients using pre-selected channels; and (b) personalized students are learnt for every individual patient, with personalized channel selection using a Gumbel-softmax approach. Our experiments on the publicly available Temple University Hospital EEG Seizure Data Corpus (TUSZ) show that both knowledge distillation and personalization play significant roles in improving the performance of seizure detection, particularly for patients with scarce EEG data. We observe that using as few as two channels, we are able to obtain competitive seizure detection performance. This, in turn, shows the potential of our approach in the more realistic scenario of wearable devices for personalized monitoring of seizures, even with few recordings.
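
The abstract names two technical ingredients: a teacher-student distillation objective and differentiable channel selection via Gumbel-softmax. The following is a minimal PyTorch sketch, not the authors' implementation, of how these two pieces could fit together; it omits the GNN backbones entirely, and the class and function names, the 19-channel montage, and all hyperparameters are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GumbelChannelSelector(nn.Module):
    # Hypothetical module: learns to pick k of C EEG channels via a
    # relaxed one-hot mask (Gumbel-softmax), as sketched in the abstract.
    def __init__(self, num_channels, k, tau=1.0):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(k, num_channels))
        self.tau = tau

    def forward(self, x):
        # x: (batch, channels, time). Each of the k rows of `logits` yields
        # a (near) one-hot vector over channels; hard=True gives a discrete
        # pick in the forward pass while keeping soft gradients backward.
        mask = F.gumbel_softmax(self.logits, tau=self.tau, hard=True)
        return torch.einsum("kc,bct->bkt", mask, x)  # (batch, k, time)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Standard knowledge-distillation objective: cross-entropy on hard labels
    # plus a temperature-scaled KL term matching the frozen teacher's output.
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  F.softmax(teacher_logits / T, dim=-1),
                  reduction="batchmean") * (T * T)
    return alpha * ce + (1.0 - alpha) * kl

# Usage (illustrative): the teacher is frozen; selector and student train jointly.
# selector = GumbelChannelSelector(num_channels=19, k=2)  # e.g. 10-20 montage, 2 channels
# x_small  = selector(x_full)                              # (batch, 2, time)
# loss = distillation_loss(student(x_small), teacher(x_full).detach(), y)

In the personalized setting, case (b) of the abstract, one selector of this kind would be trained per patient, so each patient can end up with a different small channel subset.
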
Pages: 547-563
Page count: 17
Related Papers
50 in total (first 10 shown)
  • [1] On Representation Knowledge Distillation for Graph Neural Networks
    Joshi, Chaitanya K.
    Liu, Fayao
    Xun, Xu
    Lin, Jie
    Foo, Chuan Sheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4656 - 4667
  • [2] Graph-Free Knowledge Distillation for Graph Neural Networks
    Deng, Xiang
    Zhang, Zhongfei
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2321 - 2327
  • [3] Epileptic seizure detection using neural fuzzy networks
    Sadati, Nasser
    Mohseni, Hamid R.
    Maghsoudi, Arash
    2006 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS, VOLS 1-5, 2006, : 596+
  • [4] Epileptic Seizure Detection Using Convolution Neural Networks
    Sukaria, William
    Malasa, James
    Kumar, Shiu
    Kumar, Rahul
    Assaf, Mansour H.
    Groza, Voicu
    Petriu, Emil M.
    Das, Sunil R.
    2022 IEEE INTERNATIONAL SYMPOSIUM ON MEDICAL MEASUREMENTS AND APPLICATIONS (MEMEA 2022), 2022,
  • [5] RELIANT: Fair Knowledge Distillation for Graph Neural Networks
    Dong, Yushun
    Zhang, Binchi
    Yuan, Yiling
    Zou, Na
    Wang, Qi
    Li, Jundong
    PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 154+
  • [6] Adaptively Denoising Graph Neural Networks for Knowledge Distillation
    Guo, Yuxin
    Yang, Cheng
    Shi, Chuan
    Tu, Ke
    Wu, Zhengwei
    Zhang, Zhiqiang
    Zhou, Jun
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES-RESEARCH TRACK AND DEMO TRACK, PT VIII, ECML PKDD 2024, 2024, 14948 : 253 - 269
  • [7] Online adversarial knowledge distillation for graph neural networks
    Wang, Can
    Wang, Zhe
    Chen, Defang
    Zhou, Sheng
    Feng, Yan
    Chen, Chun
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237
  • [8] Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks
    Wu, Lirong
    Lin, Haitao
    Huang, Yufei
    Li, Stan Z.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [9] Accelerating Molecular Graph Neural Networks via Knowledge Distillation
    Kelvinius, Filip Ekstrom
    Georgiev, Dimitar
    Toshev, Artur Petrov
    Gasteiger, Johannes
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36, NEURIPS 2023, 2023,
  • [10] Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks
    Yang, Chenxiao
    Wu, Qitian
    Yan, Junchi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,