Longitudinal self-supervised learning

Cited by: 36
Authors
Zhao, Qingyu [1]
Liu, Zixuan [2]
Adeli, Ehsan [1,3]
Pohl, Kilian M. [1,4]
Affiliations
[1] Stanford Univ, Dept Psychiat & Behav Sci, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[3] Stanford Univ, Dept Comp Sci, Stanford, CA 94305 USA
[4] SRI Int, Ctr Biomed Sci, Menlo Pk, CA 95025 USA
Keywords
Longitudinal neuroimaging; Self-supervised learning; Factor disentanglement; Brain age; BRAIN;
DOI
10.1016/j.media.2021.102051
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Machine learning analysis of longitudinal neuroimaging data is typically based on supervised learning, which requires a large number of ground-truth labels to be informative. As ground-truth labels are often missing or expensive to obtain in neuroscience, we avoid them in our analysis by combining factor disentanglement with self-supervised learning to identify changes and consistencies across the multiple MRIs acquired of each individual over time. Specifically, we propose a new definition of disentanglement by formulating a multivariate mapping between factors (e.g., brain age) associated with an MRI and a latent image representation. Then, factors that evolve across acquisitions of longitudinal sequences are disentangled from that mapping by self-supervised learning in such a way that changes in a single factor induce change along one direction in the representation space. We implement this model, named Longitudinal Self-Supervised Learning (LSSL), via a standard autoencoding structure with a cosine loss to disentangle brain age from the image representation. We apply LSSL to two longitudinal neuroimaging studies to highlight its strength in extracting brain-age information from MRI and revealing informative characteristics associated with neurodegenerative and neuropsychological disorders. Moreover, the representations learned by LSSL facilitate supervised classification, yielding faster convergence and higher (or similar) prediction accuracy compared to several other representation learning techniques. (c) 2021 Elsevier B.V. All rights reserved.
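The core mechanism described in the abstract, an autoencoder trained on pairs of scans from the same subject plus a cosine loss that pushes the latent difference between the two visits to align with a single learned direction, can be illustrated with a minimal sketch. This is not the authors' released code: the class and function names (LSSL, lssl_loss), the MLP encoder/decoder stand-ins for the 3D CNNs applied to MRIs, the direction parameter tau, and the weight lam are assumptions made for illustration only.

```python
# Minimal sketch of the longitudinal self-supervised idea (assumed, simplified).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSSL(nn.Module):
    def __init__(self, in_dim=4096, latent_dim=128):
        super().__init__()
        # Simple MLP encoder/decoder stand-ins for the 3D CNNs used on MRIs.
        self.encoder = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(),
                                     nn.Linear(512, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 512), nn.ReLU(),
                                     nn.Linear(512, in_dim))
        # Learned global direction meant to encode longitudinal (brain-age) change.
        self.tau = nn.Parameter(torch.randn(latent_dim))

    def forward(self, x1, x2):
        z1, z2 = self.encoder(x1), self.encoder(x2)
        return z1, z2, self.decoder(z1), self.decoder(z2)

def lssl_loss(model, x1, x2, lam=1.0):
    """x1, x2: flattened scan pairs of the same subjects (x2 acquired later)."""
    z1, z2, r1, r2 = model(x1, x2)
    recon = F.mse_loss(r1, x1) + F.mse_loss(r2, x2)
    # Cosine loss: encourage the latent trajectory z2 - z1 to point along tau,
    # so change over time maps to one direction of the representation space.
    cos = F.cosine_similarity(z2 - z1, model.tau.unsqueeze(0), dim=1)
    return recon + lam * (1.0 - cos).mean()

# Toy usage with random tensors standing in for preprocessed MRI pairs.
model = LSSL(in_dim=4096, latent_dim=128)
x1 = torch.randn(8, 4096)  # baseline scans
x2 = torch.randn(8, 4096)  # follow-up scans of the same subjects
loss = lssl_loss(model, x1, x2)
loss.backward()
```

Training on subject-specific scan pairs is what makes the objective self-supervised: no diagnostic or age labels are needed, only the ordering of visits.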
Pages: 11
Related Papers (50 total)
  • [1] Self-supervised learning of neighborhood embedding for longitudinal MRI
    Ouyang, Jiahong
    Zhao, Qingyu
    Adeli, Ehsan
    Zaharchuk, Greg
    Pohl, Kilian M.
    MEDICAL IMAGE ANALYSIS, 2022, 82
  • [2] Self-supervised Longitudinal Neighbourhood Embedding
    Ouyang, Jiahong
    Zhao, Qingyu
    Adeli, Ehsan
Sullivan, Edith V.
    Pfefferbaum, Adolf
    Zaharchuk, Greg
    Pohl, Kilian M.
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2021, PT II, 2021, 12902 : 80 - 89
  • [3] Gated Self-supervised Learning for Improving Supervised Learning
    Fuadi, Erland Hillman
    Ruslim, Aristo Renaldo
    Wardhana, Putu Wahyu Kusuma
    Yudistira, Novanto
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 611 - 615
  • [4] Self-Supervised Dialogue Learning
    Wu, Jiawei
    Wang, Xin
    Wang, William Yang
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 3857 - 3867
  • [5] Self-supervised learning model
    Saga, Kazushie
    Sugasaka, Tamami
    Sekiguchi, Minoru
Fujitsu Scientific and Technical Journal, 1993, 29 (03): 209 - 216
  • [6] Credal Self-Supervised Learning
    Lienen, Julian
    Huellermeier, Eyke
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [7] Self-Supervised Learning for Recommendation
    Huang, Chao
    Xia, Lianghao
    Wang, Xiang
    He, Xiangnan
    Yin, Dawei
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 5136 - 5139
  • [8] Quantum self-supervised learning
    Jaderberg, B.
    Anderson, L. W.
    Xie, W.
    Albanie, S.
    Kiffner, M.
    Jaksch, D.
QUANTUM SCIENCE AND TECHNOLOGY, 2022, 7 (03)
  • [9] Self-Supervised Learning for Electroencephalography
    Rafiei, Mohammad H.
    Gauthier, Lynne V.
    Adeli, Hojjat
    Takabi, Daniel
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (02) : 1457 - 1471
  • [10] Longitudinal Self-supervised Learning Using Neural Ordinary Differential Equation
    Zeghlache, Rachid
    Conze, Pierre-Henri
    Daho, Mostafa El Habib
    Li, Yihao
    Le Boite, Hugo
    Tadayoni, Ramin
    Massin, Pascal
    Cochener, Beatrice
    Brahim, Ikram
    Quellec, Gwenole
    Lamard, Mathieu
    PREDICTIVE INTELLIGENCE IN MEDICINE, PRIME 2023, 2023, 14277 : 1 - 13