Exploring Implicit Biological Heterogeneity in ASD Diagnosis Using a Multi-Head Attention Graph Neural Network

Cited by: 0
Authors
Moon, Hyung-Jun [1 ]
Cho, Sung-Bae [2 ]
Affiliations
[1] Yonsei Univ, Dept Artificial Intelligence, Seoul 03722, South Korea
[2] Yonsei Univ, Dept Comp Sci, Seoul 03722, South Korea
Keywords
autism spectrum disorder; dynamic functional connectivity; graph neural network; multi-head attention; AUTISM; AMYGDALA; CHILDREN; CLASSIFICATION; PREDICTION; MICROGLIA;
DOI
10.31083/j.jin2307135
Chinese Library Classification (CLC): Q189 [Neuroscience]
Subject Classification Code: 071006
Abstract
Background: Autism spectrum disorder (ASD) is a neurodevelopmental disorder with heterogeneous characteristics across patients, including variability in developmental progression and distinct neuroanatomical features influenced by sex and age. Recent deep learning models based on functional connectivity (FC) graphs have produced promising results, but they focus on generalized global activation patterns and fail to capture specialized regional characteristics or to accurately assess disease indicators.

Methods: To overcome these limitations, we propose a novel deep learning method that models FC with multi-head attention, enabling simultaneous modeling of the intricate and variable patterns of brain connectivity associated with ASD and effective extraction of abnormal connectivity patterns. The proposed method not only identifies region-specific correlations but also emphasizes connections at specific, transient time points from diverse perspectives. The extracted FC is transformed into a graph whose edges carry weighted labels reflecting the degree of correlation, and this graph is processed by a graph neural network capable of handling edge labels.

Results: Experiments on the Autism Brain Imaging Data Exchange (ABIDE) I and II datasets, which include a heterogeneous cohort, showed superior performance over state-of-the-art methods, improving accuracy by up to 3.7 percentage points. Incorporating multi-head attention into FC analysis markedly improved the distinction between typical brains and those affected by ASD. Additionally, the ablation study validated the diverse brain characteristics of ASD patients across different ages and sexes, offering insightful interpretations.

Conclusion: These results underscore the method's effectiveness in improving diagnostic accuracy and its potential to advance neurological research on ASD diagnosis.
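The Methods paragraph above describes the pipeline only at a high level. The following is a minimal, hypothetical PyTorch sketch, not the authors' implementation, of how such a pipeline could look: multi-head attention is applied across ROI time series, the averaged attention weights are used as weighted functional-connectivity edges, and a simple edge-weighted message-passing step plays the role of the graph neural network. All class names, dimensions, and hyperparameters (e.g., 200 ROIs, 100 timepoints, 4 heads) are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the paper's architecture or code):
# attention-derived FC edges feeding a simple edge-weighted GNN-style layer.
import torch
import torch.nn as nn

class AttentionFCGraphNet(nn.Module):
    def __init__(self, n_rois=200, t_len=100, n_heads=4, hidden=64):
        super().__init__()
        # Each ROI's time series is treated as one token, so the attention
        # weights approximate ROI-to-ROI coupling (a data-driven FC matrix).
        self.attn = nn.MultiheadAttention(embed_dim=t_len, num_heads=n_heads,
                                          batch_first=True)
        self.node_proj = nn.Linear(t_len, hidden)   # node feature encoder
        self.msg_proj = nn.Linear(hidden, hidden)   # message transformation
        self.readout = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2))  # ASD vs. control

    def forward(self, x):
        # x: (batch, n_rois, t_len) ROI time series
        _, attn_w = self.attn(x, x, x, need_weights=True,
                              average_attn_weights=True)
        # attn_w: (batch, n_rois, n_rois) -- used here as weighted edge labels
        h = torch.relu(self.node_proj(x))            # initial node features
        msg = torch.bmm(attn_w, self.msg_proj(h))    # edge-weighted aggregation
        h = torch.relu(h + msg)                      # one message-passing step
        return self.readout(h.mean(dim=1))           # graph-level prediction

# Usage: 8 subjects, 200 ROIs, 100 timepoints each (random data for shape check).
model = AttentionFCGraphNet()
logits = model(torch.randn(8, 200, 100))
print(logits.shape)  # torch.Size([8, 2])
```

In this sketch the dense attention matrix stands in for the weighted edge labels described in the abstract; a dedicated edge-label-aware GNN layer (and the paper's handling of dynamic, time-resolved FC) would replace the single message-passing step shown here.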
Pages: 14
相关论文
共 50 条
  • [41] Capsule Network Improved Multi-Head Attention for Word Sense Disambiguation
    Cheng, Jinfeng
    Tong, Weiqin
    Yan, Weian
    APPLIED SCIENCES-BASEL, 2021, 11 (06):
  • [42] Gaze Estimation Network Based on Multi-Head Attention, Fusion, and Interaction
    Li, Changli
    Li, Fangfang
    Zhang, Kao
    Chen, Nenglun
    Pan, Zhigeng
    SENSORS, 2025, 25 (06)
  • [43] Multimodal Approach of Speech Emotion Recognition Using Multi-Level Multi-Head Fusion Attention-Based Recurrent Neural Network
    Ngoc-Huynh Ho
    Yang, Hyung-Jeong
    Kim, Soo-Hyung
    Lee, Gueesang
    IEEE ACCESS, 2020, 8 : 61672 - 61686
  • [44] Identify influential nodes in social networks with graph multi-head attention regression model
    Kou, Jiangheng
    Jia, Peng
    Liu, Jiayong
    Dai, Jinqiao
    Luo, Hairu
    NEUROCOMPUTING, 2023, 530 : 23 - 36
  • [45] Multi-head Attention and Graph Convolutional Networks with Regularized Dropout for Biomedical Relation Extraction
    Huang, Mian
    Wang, Jian
    Lin, Hongfei
    Yang, Zhihao
    HEALTH INFORMATION PROCESSING, CHIP 2023, 2023, 1993 : 98 - 111
  • [46] Hunt for Unseen Intrusion: Multi-Head Self-Attention Neural Detector
    Seo, Seongyun
    Han, Sungmin
    Park, Janghyeon
    Shim, Shinwoo
    Ryu, Han-Eul
    Cho, Byoungmo
    Lee, Sangkyun
    IEEE ACCESS, 2021, 9 : 129635 - 129647
  • [47] A multi-head attention neural network with non-linear correlation approach for time series causal discovery
    Irribarra, Nicolas
    Michell, Kevin
    Bermeo, Cristhian
    Kristjanpoller, Werner
    APPLIED SOFT COMPUTING, 2024, 165
  • [48] A Network Intrusion Detection Model Based on BiLSTM with Multi-Head Attention Mechanism
    Zhang, Jingqi
    Zhang, Xin
    Liu, Zhaojun
    Fu, Fa
    Jiao, Yihan
    Xu, Fei
    ELECTRONICS, 2023, 12 (19)
  • [49] Multi-Head Attention Affinity Diversity Sharing Network for Facial Expression Recognition
    Zheng, Caixia
    Liu, Jiayu
    Zhao, Wei
    Ge, Yingying
    Chen, Wenhe
    ELECTRONICS, 2024, 13 (22)
  • [50] Using Mention Segmentation to Improve Event Detection with Multi-head Attention
    Chen, Jiali
    Hong, Yu
    Zhang, Jingli
    Yao, Jianmin
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2019, : 367 - 372