Enhancing Multimodal Patterns in Neuroimaging by Siamese Neural Networks with Self-Attention Mechanism

Times Cited: 14
Authors
Arco, Juan E. [1,2,3]
Ortiz, Andres [2,3]
Gallego-Molina, Nicolas J. [2,3]
Gorriz, Juan M. [1,3]
Ramirez, Javier [1,3]
Affiliations
[1] Univ Granada, Dept Signal Theory Networking & Commun, Granada 18010, Spain
[2] Univ Malaga, Dept Commun Engn, Malaga 29010, Spain
[3] Andalusian Res Inst Data Sci & Computat Intelligence, Granada, Spain
Keywords
Multimodal combination; siamese neural network; self-attention; deep learning; medical imaging; ALZHEIMERS-DISEASE; FUNCTIONAL CONNECTIVITY; MATTER LOSS; DIAGNOSIS; FUSION; MULTISCALE; MRI
DOI
10.1142/S0129065723500193
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Combining different sources of information is currently one of the most relevant aspects of the diagnostic process for several diseases. In the field of neurological disorders, imaging modalities providing structural and functional information are frequently available. These modalities are usually analyzed separately, although a joint analysis of the features extracted from both sources can improve the classification performance of Computer-Aided Diagnosis (CAD) tools. Previous studies have built independent models from each individual modality and combined them in a subsequent stage, which is not an optimal solution. In this work, we propose a method based on the principles of siamese neural networks to fuse information from Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET). This framework quantifies the similarities between the two modalities and relates them to the diagnostic label during training. The resulting latent space at the output of this network is then fed into an attention module to evaluate the relevance of each brain region at different stages of the development of Alzheimer's disease. The excellent results obtained and the high flexibility of the proposed method allow the fusion of more than two modalities, leading to a scalable methodology that can be used in a wide range of contexts.
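The abstract outlines the overall pattern: a shared-weight (siamese) encoder maps both modalities into a common latent space, a similarity term ties the two embeddings to the diagnostic label during training, and a self-attention stage scores the relevance of brain regions. The following PyTorch sketch only illustrates that pattern; the class name SiameseAttentionFusion, the layer sizes, the region count, the averaging-based fusion, and the contrastive-style loss are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming region-wise MRI/PET feature vectors as inputs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiameseAttentionFusion(nn.Module):
    def __init__(self, n_regions=116, feat_dim=64, latent_dim=32, n_heads=4, n_classes=2):
        super().__init__()
        # Shared encoder (siamese branch) applied to both modalities.
        self.encoder = nn.Sequential(
            nn.Linear(feat_dim, latent_dim), nn.ReLU(),
            nn.Linear(latent_dim, latent_dim),
        )
        # Self-attention over the brain-region axis of the fused latent space.
        self.attn = nn.MultiheadAttention(latent_dim, n_heads, batch_first=True)
        self.classifier = nn.Linear(latent_dim, n_classes)

    def forward(self, mri, pet):
        # mri, pet: (batch, n_regions, feat_dim) region-wise descriptors.
        z_mri = self.encoder(mri)          # (B, R, latent_dim)
        z_pet = self.encoder(pet)          # same weights -> same latent space
        # Distance between modality embeddings, usable by a contrastive-style loss.
        dist = F.pairwise_distance(z_mri.flatten(1), z_pet.flatten(1))
        # Fuse modalities and let self-attention weigh region relevance.
        fused = 0.5 * (z_mri + z_pet)
        attended, weights = self.attn(fused, fused, fused)  # weights: (B, R, R)
        logits = self.classifier(attended.mean(dim=1))
        return logits, dist, weights


# Toy usage with random tensors standing in for MRI/PET region features.
model = SiameseAttentionFusion()
mri = torch.randn(8, 116, 64)
pet = torch.randn(8, 116, 64)
labels = torch.randint(0, 2, (8,))
logits, dist, attn_weights = model(mri, pet)
loss = F.cross_entropy(logits, labels) + dist.mean()  # illustrative joint objective
loss.backward()
```

The attention weights returned per sample could be inspected to see which regions the model emphasizes, echoing the region-relevance analysis described in the abstract.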
Pages: 18