Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network

Cited by: 192
Authors
Cheng, Zhongjian [1 ]
Yan, Cheng [1 ,2 ]
Wu, Fang-Xiang [3 ]
Wang, Jianxin [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Hunan Prov Key Lab Bioinformat, Changsha 410083, Peoples R China
[2] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun 558000, Guizhou, Peoples R China
[3] Univ Saskatchewan, Dept Mech Engn, Div Biomed Engn, Saskatoon, SK S7N 5A9, Canada
Funding
National Natural Science Foundation of China;
Keywords
Proteins; Drugs; Predictive models; Amino acids; Feature extraction; Compounds; Biological system modeling; Drug-target interactions; multi-head self-attention; graph attention network;
DOI
10.1109/TCBB.2021.3077905
CLC Classification Number
Q5 [Biochemistry];
Subject Classification Codes
071010 ; 081704 ;
Abstract
Identifying drug-target interactions (DTIs) is an important step in new drug discovery and drug repositioning. Accurate DTI predictions can improve the efficiency of drug discovery and development. Although rapid advances in deep learning have produced various computational methods, it remains worthwhile to investigate how to design efficient networks for predicting DTIs. In this study, we propose an end-to-end deep learning method (called MHSADTI) to predict DTIs based on a graph attention network and a multi-head self-attention mechanism. First, the characteristics of drugs and proteins are extracted by the graph attention network and the multi-head self-attention mechanism, respectively. Then, the attention scores are used to identify which amino acid subsequences in a protein are more important to the drug when predicting its interactions. Finally, after obtaining the feature vectors of drugs and proteins, we predict DTIs with a fully connected layer. MHSADTI exploits the self-attention mechanism to capture long-range contextual relationships in amino acid sequences and to make DTI predictions interpretable. The attention mechanism in the graph attention network also yields more effective molecular characteristics. Multiple cross-validation experiments are adopted to assess the performance of MHSADTI. Experiments on four datasets (human, C. elegans, DUD-E, and DrugBank) show that our method outperforms state-of-the-art methods in terms of AUC, Precision, Recall, AUPR, and F1-score. In addition, case studies demonstrate that our method can provide effective visualizations to interpret the prediction results from biological insights.
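The pipeline the abstract describes (a graph attention layer encoding the drug's molecular graph, multi-head self-attention encoding the protein sequence, pooled features scored by a fully connected layer) can be sketched in NumPy. This is a minimal illustrative sketch, not the paper's implementation: all dimensions, the random weights, the mean pooling, and the single-layer depth are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention_layer(H, A, W, a):
    """One GAT layer. H: (n_atoms, f_in) node features, A: adjacency
    (with self-loops), W: (f_in, f_out), a: (2*f_out,) attention vector."""
    Z = H @ W                                            # (n, f_out)
    n = Z.shape[0]
    # pairwise logits e_ij = LeakyReLU(a^T [z_i || z_j])
    logits = np.array([[np.concatenate([Z[i], Z[j]]) @ a
                        for j in range(n)] for i in range(n)])
    logits = np.where(logits > 0, logits, 0.2 * logits)  # LeakyReLU
    logits = np.where(A > 0, logits, -1e9)               # neighbours only
    alpha = softmax(logits, axis=1)                      # attention weights
    return np.tanh(alpha @ Z)

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_model), split per head.
    The (L, L) score matrices are the per-residue attention maps that the
    abstract uses for interpretability."""
    L, d = X.shape
    dh = d // n_heads
    outs = []
    for h in range(n_heads):
        s = slice(h * dh, (h + 1) * dh)
        Q, K, V = X @ Wq[:, s], X @ Wk[:, s], X @ Wv[:, s]
        scores = softmax(Q @ K.T / np.sqrt(dh), axis=-1)
        outs.append(scores @ V)
    return np.concatenate(outs, axis=-1)

# toy drug graph (4 atoms) and protein sequence (6 residues)
f_in, f_out, d_model, n_heads = 8, 8, 8, 2
H = rng.standard_normal((4, f_in))
A = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [0, 0, 1, 1]])
X = rng.standard_normal((6, d_model))

drug_vec = graph_attention_layer(
    H, A, rng.standard_normal((f_in, f_out)),
    rng.standard_normal(2 * f_out)).mean(axis=0)         # pooled drug feature
prot_vec = multi_head_self_attention(
    X, *(rng.standard_normal((d_model, d_model)) for _ in range(3)),
    n_heads).mean(axis=0)                                # pooled protein feature

# "fully connected layer": logistic score on the concatenated features
w = rng.standard_normal(f_out + d_model)
score = 1 / (1 + np.exp(-(np.concatenate([drug_vec, prot_vec]) @ w)))
print(round(float(score), 3))
```

The masked softmax in the GAT layer is what restricts each atom to attending over its bonded neighbours, while the protein branch attends over the full sequence, which is how the method captures long-range residue context.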
Pages: 2208-2218
Number of pages: 11
Related Papers
50 records in total
  • [21] Multi-Head Spatiotemporal Attention Graph Convolutional Network for Traffic Prediction
    Oluwasanmi, Ariyo
    Aftab, Muhammad Umar
    Qin, Zhiguang
    Sarfraz, Muhammad Shahzad
    Yu, Yang
    Rauf, Hafiz Tayyab
    SENSORS, 2023, 23 (08)
  • [22] Neural News Recommendation with Multi-Head Self-Attention
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Qi, Tao
    Huang, Yongfeng
    Xie, Xing
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6389 - 6394
  • [23] Research on Enhanced Multi-head Self-Attention Social Recommendation Algorithm Based on Graph Neural Network
    Teng, Yue
    Yang, Kai
    IAENG International Journal of Computer Science, 2024, 51 (07) : 754 - 764
  • [24] A pseudo-label supervised graph fusion attention network for drug-target interaction prediction
    Xie, Yining
    Wang, Xiaodong
    Wang, Pengda
    Bi, Xueyan
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 259
  • [25] A graph multi-head self-attention neural network for the multi-point long-term prediction of sea surface temperature
    Sheng, Longquan
    Xu, LingYu
    Yu, Jie
    Li, ZhuoLin
    REMOTE SENSING LETTERS, 2023, 14 (08) : 786 - 796
  • [26] SPEECH ENHANCEMENT USING SELF-ADAPTATION AND MULTI-HEAD SELF-ATTENTION
    Koizumi, Yuma
    Yatabe, Kohei
    Delcroix, Marc
    Masuyama, Yoshiki
    Takeuchi, Daiki
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 181 - 185
  • [27] MSSTGCN: Multi-Head Self-Attention and Spatial-Temporal Graph Convolutional Network for Multi-Scale Traffic Flow Prediction
    Zong, Xinlu
    Yu, Fan
    Chen, Zhen
    Xia, Xue
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (02): : 3517 - 3537
  • [28] Prediction of Drug-Target Affinity Using Attention Neural Network
    Tang, Xin
    Lei, Xiujuan
    Zhang, Yuchen
    INTERNATIONAL JOURNAL OF MOLECULAR SCIENCES, 2024, 25 (10)
  • [29] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei
    Huang, Caihao
    Sun, Jingqin
    Huang, Faliang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [30] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793