Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network

Cited by: 192
Authors
Cheng, Zhongjian [1]
Yan, Cheng [1,2]
Wu, Fang-Xiang [3]
Wang, Jianxin [1]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Hunan Prov Key Lab Bioinformat, Changsha 410083, Peoples R China
[2] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun 558000, Guizhou, Peoples R China
[3] Univ Saskatchewan, Dept Mech Engn, Div Biomed Engn, Saskatoon, SK S7N 5A9, Canada
Funding
National Natural Science Foundation of China
Keywords
Proteins; Drugs; Predictive models; Amino acids; Feature extraction; Compounds; Biological system modeling; Drug-target interactions; multi-head self-attention; graph attention network;
DOI
10.1109/TCBB.2021.3077905
Chinese Library Classification (CLC)
Q5 [Biochemistry]
Discipline codes
071010; 081704
Abstract
Identifying drug-target interactions (DTIs) is an important step in new drug discovery and drug repositioning, and accurate DTI predictions can improve the efficiency of drug discovery and development. Although rapid advances in deep learning have produced various computational methods, it remains appealing to investigate how to design efficient networks for predicting DTIs. In this study, we propose an end-to-end deep learning method (called MHSADTI) to predict DTIs based on a graph attention network and a multi-head self-attention mechanism. First, the features of drugs and proteins are extracted by the graph attention network and the multi-head self-attention mechanism, respectively. Then, the attention scores are used to identify which amino acid subsequences in a protein are more important to a given drug when predicting their interaction. Finally, after obtaining the feature vectors of drugs and proteins, we predict DTIs with a fully connected layer. MHSADTI takes advantage of the self-attention mechanism to capture long-range contextual dependencies in amino acid sequences and to make the DTI predictions interpretable. The attention mechanism of the graph attention network also yields more effective molecular features. Multiple cross-validation experiments are conducted to assess the performance of MHSADTI. Experiments on four datasets (human, C. elegans, DUD-E and DrugBank) show that our method outperforms state-of-the-art methods in terms of AUC, precision, recall, AUPR and F1-score. In addition, case studies further demonstrate that our method can provide effective visualizations that interpret the prediction results from a biological perspective.
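To make the described pipeline concrete, the following PyTorch code sketches the kind of architecture the abstract outlines: a graph attention layer over the drug's molecular graph, multi-head self-attention over embedded amino-acid subsequences, and a fully connected classifier on the concatenated feature vectors, with the protein attention scores returned for inspection. This is a minimal sketch, not the authors' implementation; the class names (SimpleGATLayer, MHSADTISketch), dimensions, readout choices, and the use of torch.nn.MultiheadAttention are all illustrative assumptions.

```python
# Minimal sketch of an MHSADTI-style model, assuming PyTorch.
# All names and hyperparameters below are illustrative, not the paper's exact code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGATLayer(nn.Module):
    """Single-head graph attention over a dense adjacency matrix (one drug graph)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_atoms, in_dim); adj: (num_atoms, num_atoms), assumed to include self-loops
        h = self.W(x)                                        # (N, out_dim)
        n = h.size(0)
        # Pairwise attention logits e_ij = a([h_i || h_j])
        h_i = h.unsqueeze(1).expand(n, n, -1)
        h_j = h.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))           # attend only to bonded neighbours
        alpha = torch.softmax(e, dim=-1)                     # (N, N) attention coefficients
        return F.elu(alpha @ h)                              # aggregated atom features


class MHSADTISketch(nn.Module):
    """Drug branch: GAT over the molecular graph; protein branch: multi-head
    self-attention over embedded amino-acid subsequence tokens; FC classifier."""

    def __init__(self, atom_dim=34, vocab_size=26, emb_dim=128, heads=8):
        super().__init__()
        self.gat = SimpleGATLayer(atom_dim, emb_dim)
        self.aa_embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.self_attn = nn.MultiheadAttention(emb_dim, heads, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * emb_dim, emb_dim), nn.ReLU(), nn.Linear(emb_dim, 1)
        )

    def forward(self, atom_feats, adj, protein_tokens):
        # atom_feats: (N, atom_dim); adj: (N, N); protein_tokens: (1, L) integer token ids
        drug_vec = self.gat(atom_feats, adj).mean(dim=0, keepdim=True)   # (1, emb)
        seq = self.aa_embed(protein_tokens)                              # (1, L, emb)
        attended, attn_scores = self.self_attn(seq, seq, seq)            # scores: (1, L, L)
        protein_vec = attended.mean(dim=1)                               # (1, emb)
        logit = self.classifier(torch.cat([drug_vec, protein_vec], dim=-1))
        return torch.sigmoid(logit), attn_scores
```

In this sketch, the returned attn_scores tensor is what one would inspect to see which protein subsequence positions receive the most weight for a given drug-protein pair, mirroring the interpretability claim in the abstract.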
Pages: 2208-2218 (11 pages)
Related Papers (50 in total)
• [31] Text summarization based on multi-head self-attention mechanism and pointer network. Qiu, Dong; Yang, Bing. Complex & Intelligent Systems, 2022, 8(01): 555-567.
• [33] MSnet: Multi-Head Self-Attention Network for Distantly Supervised Relation Extraction. Sun, Tingting; Zhang, Chunhong; Ji, Yang; Hu, Zheng. IEEE Access, 2019, 7: 54472-54482.
• [34] A Supervised Multi-Head Self-Attention Network for Nested Named Entity Recognition. Xu, Yongxiu; Huang, Heyan; Feng, Chong; Hu, Yue. Thirty-Fifth AAAI Conference on Artificial Intelligence, Thirty-Third Conference on Innovative Applications of Artificial Intelligence and the Eleventh Symposium on Educational Advances in Artificial Intelligence, 2021, 35: 14185-14193.
• [35] Identifying Drug-Target Interactions Through a Combined Graph Attention Mechanism and Self-attention Sequence Embedding Model. Wang, Kang; Hu, Jing; Zhang, Xiaolong. Advanced Intelligent Computing Technology and Applications, ICIC 2023, Pt III, 2023, 14088: 246-257.
• [36] Dilated Residual Network with Multi-Head Self-Attention for Speech Emotion Recognition. Li, Runnan; Wu, Zhiyong; Jia, Jia; Zhao, Sheng; Meng, Helen. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 6675-6679.
• [37] NFSA-DTI: A Novel Drug-Target Interaction Prediction Model Using Neural Fingerprint and Self-Attention Mechanism. Liu, Feiyang; Xu, Huang; Cui, Peng; Li, Shuo; Wang, Hongbo; Wu, Ziye. International Journal of Molecular Sciences, 2024, 25(21).
• [38] Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis. Leng, Xue-Liang; Miao, Xiao-Ai; Liu, Tao. Multimedia Tools and Applications, 2021, 80(08): 12581-12600.
• [40] Detection of malicious URLs using Temporal Convolutional Network and Multi-Head Self-Attention mechanism. Nguyet Quang Do; Selamat, Ali; Krejcar, Ondrej; Fujita, Hamido. Applied Soft Computing, 2025, 169.