Deep Collaborative Learning With Application to the Study of Multimodal Brain Development

Cited: 35
Authors
Hu, Wenxing [1 ]
Cai, Biao [1 ]
Zhang, Aiying [1 ]
Calhoun, Vince D. [2 ]
Wang, Yu-Ping [1 ]
Affiliations
[1] Tulane Univ, Biomed Engn Dept, New Orleans, LA 70118 USA
[2] Univ New Mexico, Mind Res Network, Albuquerque, NM 87131 USA
Keywords
Canonical correlation; deep network; fMRI; functional connectivity; brain development
DOI
10.1109/TBME.2019.2904301
CLC Number
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Objective: Multi-modal functional magnetic resonance imaging has been widely used for brain research. Conventional data-fusion methods cannot capture complex relationships (e.g., nonlinear predictive relationships) between multiple data sets. This paper aims to develop a neural network framework to extract phenotype-related cross-data relationships and to use it to study brain development. Methods: We propose a novel method, deep collaborative learning (DCL), to address the limitations of existing methods. DCL first uses a deep network to represent the original data and then seeks their correlations, while also linking the data representations with phenotypical information. Results: We studied the differences in functional connectivity (FC) between different age groups and also used FC as a fingerprint to predict cognitive abilities. Our experiments demonstrated higher accuracy of DCL than conventional models when classifying populations of different ages and cognitive scores. Moreover, DCL revealed that brain connections become stronger during adolescence. Furthermore, DCL detected strong correlations between the default mode network and other networks that were overlooked by linear canonical correlation analysis, demonstrating DCL's ability to detect nonlinear correlations. Conclusion: The results verified the superiority of DCL over conventional data-fusion methods. In addition, the strengthening of brain connections demonstrated the importance of adolescence for brain development. Significance: DCL can better capture complex correlations between multiple data sets while also fitting them to phenotypes, with the potential to overcome the limitations of several current data-fusion models.
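The abstract describes DCL as modality-specific deep networks whose representations are driven both toward mutual correlation and toward the phenotype. The sketch below is a minimal, illustrative PyTorch rendering of that idea; the network sizes, the simplified per-dimension correlation loss (a stand-in for a full CCA-style objective), and all variable names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a DCL-style objective: two modality encoders,
# a correlation term between their latent codes, and a phenotype term.
# All dimensions and names are hypothetical.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Small MLP mapping one modality into a shared latent space."""
    def __init__(self, in_dim, latent_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

def correlation_loss(z1, z2, eps=1e-8):
    """Negative mean per-dimension Pearson correlation between the two
    latent codes (a simplified surrogate for a deep-CCA objective)."""
    z1 = z1 - z1.mean(dim=0, keepdim=True)
    z2 = z2 - z2.mean(dim=0, keepdim=True)
    num = (z1 * z2).sum(dim=0)
    den = z1.norm(dim=0) * z2.norm(dim=0) + eps
    return -(num / den).mean()

# Hypothetical setup: two fMRI-derived feature sets and one cognitive score.
enc1, enc2 = Encoder(in_dim=264), Encoder(in_dim=264)
head = nn.Linear(64, 1)  # predicts the phenotype from both latent codes
params = list(enc1.parameters()) + list(enc2.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

x1, x2 = torch.randn(100, 264), torch.randn(100, 264)  # toy paired modalities
y = torch.randn(100, 1)                                # toy phenotype

for _ in range(50):
    z1, z2 = enc1(x1), enc2(x2)
    pred = head(torch.cat([z1, z2], dim=1))
    loss = correlation_loss(z1, z2) + nn.functional.mse_loss(pred, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice, the correlation term would be the paper's actual CCA-style objective and the phenotype head would match the target used in the study (e.g., age group classification or cognitive-score regression).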
Pages: 3346-3359
Page count: 14
Related Papers
(50 records)
  • [1] A Deep Dynamic Causal Learning Model to Study Changes in Dynamic Effective Connectivity During Brain Development
    Wang, Yingying
    Qiao, Chen
    Qu, Gang
    Calhoun, Vince D.
    Stephen, Julia M.
    Wilson, Tony W.
    Wang, Yu-Ping
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2024, 71 (12) : 3390 - 3401
  • [2] Multimodal data analysis of epileptic EEG and rs-fMRI via deep learning and edge computing
    Hosseini, Mohammad-Parsa
    Tran, Tuyen X.
    Pompili, Dario
    Elisevich, Kost
    Soltanian-Zadeh, Hamid
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2020, 104
  • [3] A Multimodal Multilevel Neuroimaging Model for Investigating Brain Connectome Development
    Hu, Yingtian
    Zeydabadinezhad, Mahmoud
    Li, Longchuan
    Guo, Ying
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2022, 117 (539) : 1134 - 1148
  • [4] Brain Functional Connectivity Analysis via Graphical Deep Learning
    Qu, Gang
    Hu, Wenxing
    Xiao, Li
    Wang, Junqi
    Bai, Yuntong
    Patel, Beenish
    Zhang, Kun
    Wang, Yu-Ping
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2022, 69 (05) : 1696 - 1706
  • [5] DEEP MULTIMODAL BRAIN NETWORK LEARNING FOR JOINT ANALYSIS OF STRUCTURAL MORPHOMETRY AND FUNCTIONAL CONNECTIVITY
    Zhang, Wen
    Wang, Yalin
    2020 IEEE 17TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI 2020), 2020, : 1924 - 1928
  • [6] The Developmental Chronnecto-Genomics (Dev-CoG) study: A multimodal study on the developing brain
    Stephen, J. M.
    Solis, I
    Janowich, J.
    Stern, M.
    Frenzel, M. R.
    Eastman, J. A.
    Mills, M. S.
    Embury, C. M.
    Coolidge, N. M.
    Heinrichs-Graham, E.
    Mayer, A.
    Liu, J.
    Wang, Y. P.
    Wilson, T. W.
    Calhoun, V. D.
    NEUROIMAGE, 2021, 225
  • [7] Self-supervised multimodal learning for group inferences from MRI data: Discovering disorder-relevant brain regions and multimodal links
    Fedorov, Alex
    Geenjaar, Eloy
    Wu, Lei
    Sylvain, Tristan
    DeRamus, Thomas P.
    Luck, Margaux
    Misiura, Maria
    Mittapalle, Girish
    Hjelm, R. Devon
    Plis, Sergey M.
    Calhoun, Vince D.
    NEUROIMAGE, 2024, 285
  • [8] Multimodal Infant Brain Segmentation by Fuzzy-Informed Deep Learning
    Ding, Weiping
    Abdel-Basset, Mohamed
    Hawash, Hossam
    Pedrycz, Witold
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2022, 30 (04) : 1088 - 1101
  • [9] Breath hold effect on cardiovascular brain pulsations - A multimodal magnetic resonance encephalography study
    Raitamaa, Lauri
    Korhonen, Vesa
    Huotari, Niko
    Raatikainen, Ville
    Hautaniemi, Taneli
    Kananen, Janne
    Rasila, Aleksi
    Helakari, Heta
    Zienkiewicz, Aleksandra
    Myllyla, Teemu
    Borchardt, Viola
    Kiviniemi, Vesa
    JOURNAL OF CEREBRAL BLOOD FLOW AND METABOLISM, 2019, 39 (12) : 2471 - 2485
  • [10] Multimodal brain predictors of current weight and weight gain in children enrolled in the ABCD study®
    Adise, Shana
    Allgaier, Nicholas
    Laurent, Jennifer
    Hahn, Sage
    Chaarani, Bader
    Owens, Max
    Yuan, DeKang
    Nyugen, Philip
    Mackey, Scott
    Potter, Alexandra
    Garavan, Hugh P.
    DEVELOPMENTAL COGNITIVE NEUROSCIENCE, 2021, 49