Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network

Cited by: 192
Authors
Cheng, Zhongjian [1]
Yan, Cheng [1,2]
Wu, Fang-Xiang [3]
Wang, Jianxin [1]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Hunan Prov Key Lab Bioinformat, Changsha 410083, Peoples R China
[2] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun 558000, Guizhou, Peoples R China
[3] Univ Saskatchewan, Dept Mech Engn, Div Biomed Engn, Saskatoon, SK S7N 5A9, Canada
Funding
National Natural Science Foundation of China
Keywords
Proteins; Drugs; Predictive models; Amino acids; Feature extraction; Compounds; Biological system modeling; Drug-target interactions; multi-head self-attention; graph attention network;
DOI
10.1109/TCBB.2021.3077905
Chinese Library Classification
Q5 [Biochemistry]
Discipline codes
071010; 081704
Abstract
Identifying drug-target interactions (DTIs) is an important step in new drug discovery and drug repositioning, and accurate DTI prediction can improve the efficiency of drug discovery and development. Although rapid advances in deep learning have produced various computational methods, it remains worthwhile to investigate how to design efficient networks for predicting DTIs. In this study, we propose an end-to-end deep learning method (called MHSADTI) to predict DTIs based on a graph attention network and a multi-head self-attention mechanism. First, the features of drugs and proteins are extracted by the graph attention network and the multi-head self-attention mechanism, respectively. Then, attention scores are used to determine which amino acid subsequences in a protein are more important for a given drug when predicting their interaction. Finally, after obtaining the feature vectors of drugs and proteins, we predict DTIs with a fully connected layer. MHSADTI takes advantage of the self-attention mechanism to capture long-range contextual dependencies in amino acid sequences and to make DTI predictions interpretable; the attention mechanism in the graph attention network also yields more effective molecular features. Multiple cross-validation experiments are conducted to assess the performance of MHSADTI. Experiments on four datasets (human, C. elegans, DUD-E, and DrugBank) show that our method outperforms state-of-the-art methods in terms of AUC, Precision, Recall, AUPR, and F1-score. In addition, case studies further demonstrate that our method can provide effective visualizations that interpret the prediction results from a biological perspective.
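The pipeline described in the abstract (graph attention over the drug's molecular graph, multi-head self-attention over the protein's amino-acid sequence, then a fully connected layer on the concatenated feature vectors) can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' MHSADTI implementation; all dimensions, random weights, mean-pooling, and activation choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, num_heads, rng):
    """Scaled dot-product self-attention over a sequence X: (seq_len, d_model)."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        Wq = rng.normal(size=(d_model, d_head)) / np.sqrt(d_model)
        Wk = rng.normal(size=(d_model, d_head)) / np.sqrt(d_model)
        Wv = rng.normal(size=(d_model, d_head)) / np.sqrt(d_model)
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        attn = softmax(Q @ K.T / np.sqrt(d_head))  # (seq_len, seq_len)
        heads.append(attn @ V)
    return np.concatenate(heads, axis=-1)          # (seq_len, d_model)

def graph_attention_layer(H, A, rng):
    """One GAT-style layer. H: (n_nodes, d) node features, A: adjacency with self-loops."""
    n, dim = H.shape
    W = rng.normal(size=(dim, dim)) / np.sqrt(dim)
    Wh = H @ W
    a = rng.normal(size=(2 * dim,))
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            z = a @ np.concatenate([Wh[i], Wh[j]])
            e[i, j] = z if z > 0 else 0.2 * z      # LeakyReLU
    e = np.where(A > 0, e, -1e9)                   # attend only along edges
    alpha = softmax(e, axis=-1)
    return np.tanh(alpha @ Wh)

# Toy inputs: a 12-residue protein and a 5-atom chain-shaped molecule.
protein = rng.normal(size=(12, 16))
atoms = rng.normal(size=(5, 16))
A = np.eye(5)
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1

prot_vec = multi_head_self_attention(protein, num_heads=4, rng=rng).mean(axis=0)
drug_vec = graph_attention_layer(atoms, A, rng).mean(axis=0)

# Fully connected output on the concatenated features -> interaction probability.
w = rng.normal(size=(32,))
score = 1.0 / (1.0 + np.exp(-(w @ np.concatenate([prot_vec, drug_vec]))))
print(f"interaction probability: {score:.3f}")
```

In the paper's design, the per-residue attention scores (the `attn` matrix above) are what enable interpretation: rows with high weight indicate amino-acid subsequences the model considers important for the interaction.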
Pages: 2208-2218
Page count: 11
Related papers
(50 total)
  • [41] KGAT: Predicting Drug-Target Interaction Based on Knowledge Graph Attention Network
    Wu, Zhenghao
    Zhang, Xiaolong
    Lin, Xiaoli
    INTELLIGENT COMPUTING THEORIES AND APPLICATION, ICIC 2022, PT II, 2022, 13394 : 438 - 450
  • [42] Masked multi-head self-attention for causal speech enhancement
    Nicolson, Aaron
    Paliwal, Kuldip K.
    SPEECH COMMUNICATION, 2020, 125 : 80 - 96
  • [43] OPNet: Optimized multi-head graph attention network for polymer properties prediction
    Wei, Wei
    Fang, Jun
    Yang, Ning
    Li, Qi
    Hu, Lin
    Han, Jie
    Zhao, Lanbo
    AIP ADVANCES, 2024, 14 (08)
  • [44] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    PLOS ONE, 2024, 19 (06):
  • [45] Neural Linguistic Steganalysis via Multi-Head Self-Attention
    Jiao, Sai-Mei
    Wang, Hai-feng
    Zhang, Kun
    Hu, Ya-qi
    JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING, 2021, 2021 (2021)
  • [46] Personalized News Recommendation with CNN and Multi-Head Self-Attention
    Li, Aibin
    He, Tingnian
    Guo, Yi
    Li, Zhuoran
    Rong, Yixuan
    Liu, Guoqi
    2022 IEEE 13TH ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2022, : 102 - 108
  • [47] Multi-modal multi-head self-attention for medical VQA
    Joshi, Vasudha
    Mitra, Pabitra
    Bose, Supratik
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (14) : 42585 - 42608
  • [48] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao
    Tian, Shengwei
    Yu, Long
    Lv, Yalong
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02) : 349 - 362
  • [49] A HYBRID TEXT NORMALIZATION SYSTEM USING MULTI-HEAD SELF-ATTENTION FOR MANDARIN
    Zhang, Junhui
    Pan, Junjie
    Yin, Xiang
    Li, Chen
    Liu, Shichao
    Zhang, Yang
    Wang, Yuxuan
    Ma, Zejun
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 6694 - 6698