Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation

Cited by: 10
Authors
Eldele, Emadeldeen [1]
Ragab, Mohamed [2]
Chen, Zhenghua [2]
Wu, Min [3]
Kwoh, Chee-Keong [1]
Li, Xiaoli [1,2]
Affiliations
[1] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[2] ASTAR, Inst Infocomm Res I2R, Ctr Frontier AI Res CFAR, Singapore 138632, Singapore
[3] ASTAR, Inst Infocomm Res I2R, Singapore 138632, Singapore
Keywords
Brain modeling; Sleep; Task analysis; Feature extraction; Data models; Self-supervised learning; Classification algorithms; Sleep stage classification; EEG; self-supervised learning; label-efficient learning; RESEARCH RESOURCE; TIME;
DOI
10.1109/TNSRE.2023.3245285
Chinese Library Classification (CLC): R318 [Biomedical Engineering]
Discipline code: 0831
Abstract
The past few years have witnessed remarkable advances in deep learning for EEG-based sleep stage classification (SSC). However, the success of these models hinges on a massive amount of labeled training data, limiting their applicability in real-world scenarios. In such scenarios, sleep labs can generate a massive amount of data, but labeling it is expensive and time-consuming. Recently, the self-supervised learning (SSL) paradigm has emerged as one of the most successful techniques for overcoming label scarcity. In this paper, we evaluate the efficacy of SSL in boosting the performance of existing SSC models in the few-labels regime. In a thorough study on three SSC datasets, we find that fine-tuning pretrained SSC models with only 5% of the labeled data achieves performance competitive with supervised training on full labels. Moreover, self-supervised pretraining makes SSC models more robust to data imbalance and domain shift.
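The few-labels protocol the abstract describes (fine-tuning a pretrained model on only 5% of the labels) rests on drawing a small, class-balanced labeled subset. The following is an illustrative sketch of that subsampling step, not the authors' code; the function name `sample_few_labels` and its interface are assumptions for illustration.

```python
import numpy as np

def sample_few_labels(y, frac=0.05, seed=0):
    """Pick a stratified `frac` subset of labeled epoch indices,
    sampling per sleep stage so minority stages are not dropped."""
    rng = np.random.default_rng(seed)
    idx = []
    for stage in np.unique(y):
        stage_idx = np.flatnonzero(y == stage)
        # Keep at least one example of each stage, even for rare ones.
        n = max(1, int(round(frac * len(stage_idx))))
        idx.extend(rng.choice(stage_idx, size=n, replace=False))
    return np.sort(np.asarray(idx))
```

In a label-efficient evaluation like the one above, the SSL-pretrained encoder would then be fine-tuned on only these indices, while supervised baselines train on the full label set.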
Pages: 1333-1342 (10 pages)