MucLiPred: Multi-Level Contrastive Learning for Predicting Nucleic Acid Binding Residues of Proteins

Cited by: 3
Authors
Zhang, Jiashuo [1 ]
Wang, Ruheng [1 ]
Wei, Leyi [2 ,3 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan 250101, Peoples R China
[2] Shandong Univ, Joint SDU NTU Ctr Artificial Intelligence Res C F, Jinan 250101, Peoples R China
[3] Macao Polytech Univ, Fac Appl Sci, Macao 999078, Peoples R China
Keywords
Sequence-based prediction; Identification; Sites
DOI
10.1021/acs.jcim.3c01471
Chinese Library Classification
R914 [Medicinal Chemistry]
Discipline Code
100701
Abstract
Protein-molecule interactions play a crucial role in many biological functions, and their accurate prediction is pivotal for drug discovery and design. Traditional prediction methods are limited: some handle only a single molecule type, restricting their applicability, while others target multiple molecule types but fail to process the diverse interaction information efficiently, leading to complexity and inefficiency. This study presents MucLiPred, a deep learning model equipped with a dual contrastive learning mechanism that improves the prediction of multiple molecule-protein interactions and the identification of potential molecule-binding residues. The mechanism operates at two levels. The residue-level paradigm differentiates binding from non-binding residues, capturing detailed local interactions. The type-level paradigm analyzes the overarching context of molecule types, such as DNA or RNA, so that representations of the same molecule type move closer together in the representational space, strengthening the model's ability to discern interaction motifs. This dual approach enables comprehensive multi-molecule prediction, elucidates the relationships among different molecule types, and sharpens protein-molecule interaction prediction. Empirical evidence demonstrates MucLiPred's superiority over existing models in robustness and prediction accuracy, and the dual contrastive learning amplifies its capability to detect potential molecule-binding residues with precision. A further optimization that separates the representation and classification tasks markedly improves performance. MucLiPred thus represents a significant advancement in protein-molecule interaction prediction and sets a new precedent for future research in this field.
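To make the dual mechanism concrete, the following is a minimal sketch of how a two-level supervised contrastive objective could be set up: a residue-level loss that separates binding from non-binding residues, and a type-level loss that pulls together embeddings of the same molecule type. This is an illustration, not the authors' implementation; the temperature, the 0.5 type-level weight, the label set (DNA/RNA/peptide), and all tensor shapes are assumptions made for exposition only.

```python
# Illustrative sketch (not the MucLiPred source code) of a dual supervised
# contrastive objective: one loss over residue labels (binding vs. non-binding)
# and one over molecule-type labels (e.g., DNA / RNA / peptide).
import torch
import torch.nn.functional as F


def supervised_contrastive_loss(embeddings: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """Pull embeddings sharing a label together; push the rest apart."""
    z = F.normalize(embeddings, dim=-1)             # (N, d) unit vectors
    sim = z @ z.T / temperature                     # (N, N) scaled similarities
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, float("-inf"))       # ignore self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos = (labels[:, None] == labels[None, :]) & ~eye   # positive-pair mask
    pos_counts = pos.sum(dim=1)
    valid = pos_counts > 0                          # anchors with >= 1 positive
    pos_log_prob = log_prob.masked_fill(~pos, 0.0).sum(dim=1)
    return (-pos_log_prob[valid] / pos_counts[valid]).mean()


# Toy usage: residue-level loss on per-residue embeddings, type-level loss on
# pooled per-sequence embeddings; the 0.5 weight is an arbitrary assumption.
residue_emb = torch.randn(32, 128, requires_grad=True)  # 32 residues, d = 128
residue_lab = torch.randint(0, 2, (32,))                # 0 = non-binding, 1 = binding
type_emb = torch.randn(8, 128, requires_grad=True)      # 8 pooled sequence embeddings
type_lab = torch.randint(0, 3, (8,))                    # 0 = DNA, 1 = RNA, 2 = peptide

total = (supervised_contrastive_loss(residue_emb, residue_lab)
         + 0.5 * supervised_contrastive_loss(type_emb, type_lab))
total.backward()
```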
Pages: 1050-1065
Page count: 16