Self-Supervised Representation Learning on Electronic Health Records with Graph Kernel Infomax

Cited: 0
Authors
Yao, Hao-ren [1 ,3 ]
Cao, Nairen [2 ]
Russell, Katina [1 ]
Chang, Der-chen [1 ]
Frieder, Ophir [1 ]
Fineman, Jeremy T. [1]
Affiliations
[1] Georgetown Univ, 3700 O St NW, Washington, DC 20057 USA
[2] Boston Coll, 140 Commonwealth Ave, Chestnut Hill, MA 02467 USA
[3] NIH, 10 Ctr Dr, Bethesda, MD 20892 USA
Keywords
Graph contrastive learning; patient representation learning; Nyström method
DOI
10.1145/3648695
Chinese Library Classification
TP39 [Computer Applications];
Discipline Classification Codes
081203 ; 0835 ;
Abstract
Learning representations of Electronic Health Records (EHRs) is an important yet under-explored research topic. It benefits various clinical decision support applications, e.g., medication outcome prediction or patient similarity search. Current approaches focus on task-specific label supervision over vectorized sequential EHR data, which is not applicable to large-scale unsupervised scenarios. Recently, contrastive learning has shown great success in self-supervised representation learning. However, complex temporality often degrades performance. We propose Graph Kernel Infomax, a self-supervised graph kernel learning approach on the graphical representation of EHR, to overcome these problems. Unlike the state of the art, we do not change the graph structure to construct augmented views. Instead, we use Kernel Subspace Augmentation to embed nodes into two geometrically different manifold views. The entire framework is trained by contrasting node and graph representations across those two manifold views through commonly used contrastive objectives. Empirically, on publicly available benchmark EHR datasets, our approach yields performance on clinical downstream tasks that exceeds the state of the art. Theoretically, the variation in distance metrics naturally creates different views as data augmentation without changing graph structures. Practically, our method is non-ad hoc and confirms superior performance on commonly used graph benchmark datasets.
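The core idea outlined in the abstract, building two augmented views by embedding nodes into subspaces induced by two different kernels and then training with a standard contrastive (InfoNCE) objective, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the choice of RBF and linear kernels, the subspace dimension, and the toy node features are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy node features (hypothetical stand-in for node embeddings of one patient graph).
X = rng.normal(size=(8, 4))  # 8 nodes, 4 features


def rbf_kernel(A, gamma=0.5):
    """RBF kernel matrix over node features."""
    sq = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)


def linear_kernel(A):
    """Linear kernel matrix over node features."""
    return A @ A.T


def kernel_subspace_view(K, dim=3):
    """Embed nodes into the top eigen-subspace of a kernel matrix.
    Geometrically different kernels induce different views of the same graph."""
    vals, vecs = np.linalg.eigh(K)            # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dim]        # pick the top-`dim` components
    return vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0.0, None))


def info_nce(Z1, Z2, tau=0.5):
    """Standard InfoNCE loss: node i in view 1 and node i in view 2 are positives."""
    Z1 = Z1 / (np.linalg.norm(Z1, axis=1, keepdims=True) + 1e-12)
    Z2 = Z2 / (np.linalg.norm(Z2, axis=1, keepdims=True) + 1e-12)
    logits = Z1 @ Z2.T / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))


# Two views from two kernels -- no edges are added or dropped.
view_a = kernel_subspace_view(rbf_kernel(X))
view_b = kernel_subspace_view(linear_kernel(X))
loss = info_nce(view_a, view_b)
print(round(loss, 4))  # a finite, non-negative contrastive loss
```

Note the design point the abstract emphasizes: the two views come from varying the kernel (i.e., the distance metric), so the graph structure itself is never perturbed, unlike edge-dropping or node-masking augmentations.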
Pages: 28
Related Papers
50 items in total
  • [41] Self-Supervised Graph Representation Learning Method Based on Data and Feature Augmentation
    Xu, Yunfeng; Fan, Hexun
    Computer Engineering and Applications, 2024, 60 (17): 148-157
  • [42] HeGCL: Advance Self-Supervised Learning in Heterogeneous Graph-Level Representation
    Shi, Gen; Zhu, Yifan; Liu, Jian K.; Li, Xuesong
    IEEE Transactions on Neural Networks and Learning Systems, 2024, 35 (10): 13914-13925
  • [43] Self-Supervised Contrastive Molecular Representation Learning with a Chemical Synthesis Knowledge Graph
    Xie, Jiancong; Wang, Yi; Rao, Jiahua; Zheng, Shuangjia; Yang, Yuedong
    Journal of Chemical Information and Modeling, 2024, 64 (06): 1945-1954
  • [44] CLEAR: Cluster-Enhanced Contrast for Self-Supervised Graph Representation Learning
    Luo, Xiao; Ju, Wei; Qu, Meng; Gu, Yiyang; Chen, Chong; Deng, Minghua; Hua, Xian-Sheng; Zhang, Ming
    IEEE Transactions on Neural Networks and Learning Systems, 2024, 35 (01): 899-912
  • [45] Self-supervised Graph-level Representation Learning with Local and Global Structure
    Xu, Minghao; Wang, Hang; Ni, Bingbing; Guo, Hongyu; Tang, Jian
    International Conference on Machine Learning, Vol 139, 2021
  • [46] SimGRL: a simple self-supervised graph representation learning framework via triplets
    Huang, Da; Lei, Fangyuan; Zeng, Xi
    Complex & Intelligent Systems, 2023, 9 (05): 5049-5062
  • [47] Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast
    Chen, Ke-Jia; Liu, Linsong; Jiang, Linpu; Chen, Jingqiang
    ACM Transactions on Knowledge Discovery from Data, 2024, 18 (01)
  • [48] LaundroGraph: Self-Supervised Graph Representation Learning for Anti-Money Laundering
    Cardoso, Mario; Saleiro, Pedro; Bizarro, Pedro
    3rd ACM International Conference on AI in Finance, ICAIF 2022, 2022: 130-138
  • [49] Scalable self-supervised graph representation learning via enhancing and contrasting subgraphs
    Jiao, Yizhu; Xiong, Yun; Zhang, Jiawei; Zhang, Yao; Zhang, Tianqi; Zhu, Yangyong
    Knowledge and Information Systems, 2022, 64 (01): 235-260
  • [50] Self-Supervised Forecasting in Electronic Health Records with Attention-Free Models
    Kumar, Y.; Ilin, A.; Salo, H.; Kulathinal, S.; Leinonen, M. K.; Marttinen, P.
    IEEE Transactions on Artificial Intelligence, 2024, 5 (08): 1-17