TransEHR: Self-Supervised Transformer for Clinical Time Series Data

Cited by: 0
Authors: Xu, Yanbo [1]; Xu, Shangqing [1]; Ramprassad, Manav [1]; Tumanov, Alexey [1]; Zhang, Chao [1]
Affiliation: [1] Georgia Inst Technol, Atlanta, GA 30332 USA
Funding: U.S. National Science Foundation
Keywords: (none listed)
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Deep neural networks, including the Transformer architecture, have achieved remarkable performance on various time series tasks. However, their effectiveness on clinical time series data is hindered by two challenges: (1) sparse event sequences collected asynchronously alongside multivariate time series, and (2) limited availability of labeled data. To address these challenges, we propose TransEHR, a self-supervised Transformer model designed to efficiently encode multi-sourced asynchronous sequential data, such as structured Electronic Health Records (EHRs). We introduce three pretext tasks for pre-training the Transformer on large amounts of unlabeled structured EHR data, followed by fine-tuning on downstream prediction tasks using the limited labeled data. Through extensive experiments on three real-world health datasets, we demonstrate that our model achieves state-of-the-art performance on benchmark clinical tasks, including in-hospital mortality classification, phenotyping, and length-of-stay prediction. Our findings highlight the efficacy of TransEHR in addressing the challenges of clinical time series data, contributing to advances in healthcare analytics.
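The abstract does not spell out the three pretext tasks, so the following is only an illustrative sketch of one generic self-supervised pretext task commonly used for time series pre-training: hiding a fraction of observed values and training the encoder to reconstruct them from the visible context. The function name, masking scheme, and parameters are assumptions for illustration, not TransEHR's actual method.

```python
import random

def make_masked_pretext_batch(series, mask_frac=0.25, mask_value=0.0, seed=0):
    """Build (input, target, mask) triples for a masked-value pretext task.

    series: a list of floats representing one univariate clinical time series.
    A random fraction of timesteps is replaced by mask_value; during
    pre-training, a model would be trained to reconstruct the hidden
    values at the masked positions from the surrounding visible context,
    requiring no task labels. (Hypothetical setup, for illustration only.)
    """
    rng = random.Random(seed)
    n = len(series)
    n_masked = max(1, int(n * mask_frac))          # always hide at least one step
    masked_idx = set(rng.sample(range(n), n_masked))
    inputs = [mask_value if i in masked_idx else v for i, v in enumerate(series)]
    targets = list(series)                          # reconstruction target
    mask = [i in masked_idx for i in range(n)]      # where the loss is computed
    return inputs, targets, mask
```

In a pipeline like the one the abstract describes, batches produced this way would drive the unsupervised pre-training stage; the pre-trained encoder is then fine-tuned on the small labeled set for the downstream clinical tasks.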
Pages: 623-635 (13 pages)