Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation

Cited by: 14
Authors
Eldele, Emadeldeen [1]
Ragab, Mohamed [2]
Chen, Zhenghua [2]
Wu, Min [3]
Kwoh, Chee-Keong [1]
Li, Xiaoli [1,2]
Affiliations
[1] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[2] ASTAR, Inst Infocomm Res I2R, Ctr Frontier AI Res CFAR, Singapore 138632, Singapore
[3] ASTAR, Inst Infocomm Res I2R, Singapore 138632, Singapore
Keywords
Brain modeling; Sleep; Task analysis; Feature extraction; Data models; Self-supervised learning; Classification algorithms; Sleep stage classification; EEG; self-supervised learning; label-efficient learning; RESEARCH RESOURCE; TIME;
DOI
10.1109/TNSRE.2023.3245285
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Subject classification code
0831;
Abstract
The past few years have witnessed remarkable advances in deep learning for EEG-based sleep stage classification (SSC). However, the success of these models relies on massive amounts of labeled training data, which limits their applicability in real-world scenarios, where sleep labs can generate large volumes of data but labeling is expensive and time-consuming. Recently, the self-supervised learning (SSL) paradigm has emerged as one of the most successful techniques for overcoming label scarcity. In this paper, we evaluate the efficacy of SSL in boosting the performance of existing SSC models in the few-labels regime. Through a thorough study on three SSC datasets, we find that fine-tuning pretrained SSC models with only 5% of labeled data can achieve performance competitive with supervised training on full labels. Moreover, self-supervised pretraining makes SSC models more robust to data imbalance and domain shift.
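The pretrain-then-fine-tune protocol described in the abstract can be illustrated with a minimal, hypothetical PyTorch sketch (not the authors' implementation): a toy 1D-CNN encoder is pretrained with a SimCLR-style contrastive objective on unlabeled EEG epochs, then fine-tuned jointly with a linear classifier on a small labeled subset (e.g., 5%). The encoder architecture, augmentation, data shapes, and hyperparameters below are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Toy 1D-CNN feature extractor for single-channel EEG epochs (assumption)."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=25, stride=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

def nt_xent(z1, z2, tau=0.5):
    """SimCLR-style contrastive loss between two augmented views of a batch."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = (z @ z.t()) / tau
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float("-inf"))  # drop self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])         # index of each positive
    return F.cross_entropy(sim, targets)

def augment(x):
    # Placeholder augmentation (real pipelines use jitter, scaling, permutation, etc.).
    return x + 0.1 * torch.randn_like(x)

# Stage 1: self-supervised pretraining on unlabeled 30-s epochs (random placeholders here).
encoder = EEGEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
unlabeled = torch.randn(256, 1, 3000)            # 256 epochs, 1 channel, 100 Hz x 30 s
for _ in range(5):                               # toy number of pretraining passes
    z1, z2 = encoder(augment(unlabeled)), encoder(augment(unlabeled))
    loss = nt_xent(z1, z2)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Stage 2: fine-tune encoder + linear head on a small labeled fraction (e.g., 5%).
classifier = nn.Linear(128, 5)                   # 5 sleep stages (W, N1, N2, N3, REM)
labeled_x = torch.randn(64, 1, 3000)             # the few labeled epochs
labeled_y = torch.randint(0, 5, (64,))
ft_opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-4)
for _ in range(5):
    loss = F.cross_entropy(classifier(encoder(labeled_x)), labeled_y)
    ft_opt.zero_grad()
    loss.backward()
    ft_opt.step()

The random tensors stand in for real sleep EEG datasets; in the paper's setting, the pretrained encoder would be reused across the downstream few-label splits and evaluated against fully supervised training.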
Pages: 1333-1342
Number of pages: 10