Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network

Cited by: 192
Authors
Cheng, Zhongjian [1 ]
Yan, Cheng [1 ,2 ]
Wu, Fang-Xiang [3 ]
Wang, Jianxin [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Hunan Prov Key Lab Bioinformat, Changsha 410083, Peoples R China
[2] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun 558000, Guizhou, Peoples R China
[3] Univ Saskatchewan, Dept Mech Engn, Div Biomed Engn, Saskatoon, SK S7N 5A9, Canada
Funding
National Natural Science Foundation of China
Keywords
Proteins; Drugs; Predictive models; Amino acids; Feature extraction; Compounds; Biological system modeling; Drug-target interactions; multi-head self-attention; graph attention network;
DOI
10.1109/TCBB.2021.3077905
Chinese Library Classification (CLC)
Q5 [Biochemistry]
Discipline classification codes
071010; 081704
Abstract
Identifying drug-target interactions (DTIs) is an important step in new drug discovery and drug repositioning, and accurate DTI prediction can improve the efficiency of drug discovery and development. Although rapid advances in deep learning have produced a variety of computational methods, how to design efficient networks for predicting DTIs remains an open question. In this study, we propose an end-to-end deep learning method, MHSADTI, which predicts DTIs based on a graph attention network and a multi-head self-attention mechanism. First, drug features are extracted by the graph attention network and protein features by the multi-head self-attention mechanism. Then, the attention scores indicate which amino acid subsequences in a protein are most important for a given drug when predicting their interaction. Finally, the feature vectors of the drug and the protein are fed to a fully connected layer to predict the DTI. The self-attention mechanism captures long-range contextual dependencies in amino acid sequences and makes the DTI predictions interpretable, while the attention mechanism in the graph attention network yields more effective molecular features. Cross-validation experiments on four datasets (human, C. elegans, DUD-E, and DrugBank) show that our method outperforms state-of-the-art methods in terms of AUC, Precision, Recall, AUPR, and F1-score. In addition, case studies demonstrate that our method provides effective visualizations that interpret the prediction results from a biological perspective.
Pages: 2208-2218 (11 pages)
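
To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of an MHSADTI-style model: a graph attention layer encodes the drug's molecular graph, multi-head self-attention encodes the protein's amino-acid sequence, and a fully connected head scores the interaction. This is a sketch under assumptions, not the authors' implementation: the dense-adjacency GAT layer, the embedding vocabularies, the mean-pooling readouts, and all layer sizes are illustrative choices.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention over a dense adjacency matrix (drug graph)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):                 # x: (N, in_dim), adj: (N, N)
        h = self.W(x)                          # (N, out_dim)
        N = h.size(0)
        adj = adj + torch.eye(N, device=adj.device)  # self-loops keep softmax rows finite
        # Attention logits e_ij = LeakyReLU(a^T [h_i || h_j]) for every node pair.
        hi = h.unsqueeze(1).expand(N, N, -1)
        hj = h.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))   # attend only to graph neighbours
        alpha = torch.softmax(e, dim=-1)
        return F.elu(alpha @ h)                # (N, out_dim)

class MHSADTISketch(nn.Module):
    # Vocabulary sizes and dimensions below are hypothetical placeholders.
    def __init__(self, n_atom_types=44, n_aa_words=8000, dim=64, heads=4):
        super().__init__()
        self.atom_emb = nn.Embedding(n_atom_types, dim)
        self.gat = GATLayer(dim, dim)                      # drug encoder
        self.aa_emb = nn.Embedding(n_aa_words, dim)        # protein encoder
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 2))

    def forward(self, atom_ids, adj, aa_ids):
        # Drug: graph attention over atoms, then mean-pool to one vector.
        drug = self.gat(self.atom_emb(atom_ids), adj).mean(dim=0)      # (dim,)
        # Protein: multi-head self-attention over amino-acid subsequence tokens.
        p = self.aa_emb(aa_ids).unsqueeze(0)                           # (1, L, dim)
        p, attn = self.self_attn(p, p, p)      # attn: (1, L, L) attention scores
        prot = p.mean(dim=1).squeeze(0)                                # (dim,)
        # Interaction: concatenate both vectors, classify with FC layers.
        return self.classifier(torch.cat([drug, prot])), attn

# Toy usage with random inputs (shapes only; real inputs would come from a
# SMILES-derived molecular graph and an n-gram-encoded protein sequence).
model = MHSADTISketch()
atoms = torch.randint(0, 44, (9,))            # a 9-atom molecule
adj = (torch.rand(9, 9) > 0.7).float()
aa = torch.randint(0, 8000, (120,))           # protein as 120 subsequence tokens
logits, attn = model(atoms, adj, aa)

The attention weights returned for the protein correspond, in spirit, to the per-subsequence scores the paper visualizes to interpret which parts of a protein matter for a given drug.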