PreAlgPro: Prediction of allergenic proteins with pre-trained protein language model and efficient neural network

Times Cited: 1
Authors
Zhang, Lingrong [1 ]
Liu, Taigang [1 ]
Affiliations
[1] Shanghai Ocean Univ, Coll Informat Technol, Shanghai 201306, Peoples R China
Keywords
Pre-trained protein language model; Allergenic proteins; Deep learning; Model interpretability; DATABASE;
DOI
10.1016/j.ijbiomac.2024.135762
Chinese Library Classification
Q5 [Biochemistry]; Q7 [Molecular Biology];
Subject Classification Codes
071010; 081704;
Abstract
Allergy is a prevalent phenomenon involving allergens such as those found in nuts and milk, and avoiding exposure to allergens is the most effective preventive measure against allergic reactions. However, current homology-based methods for identifying allergenic proteins encounter challenges when dealing with non-homologous data, while traditional machine learning approaches rely on manually extracted features that lack important protein functional characteristics, including evolutionary information. Consequently, there is still considerable room for improvement in existing methods. In this study, we present PreAlgPro, a method for identifying allergenic proteins based on a pre-trained protein language model and deep learning techniques. Specifically, we employed the ProtT5 model to extract protein embedding features, replacing the manual feature extraction step. Furthermore, we devised an Attention-CNN neural network architecture to identify potential features that contribute to the classification of allergenic proteins. The performance of our model was evaluated on four independent test sets, and the experimental results demonstrate that PreAlgPro surpasses existing state-of-the-art methods. Additionally, we collected allergenic protein samples to validate the robustness of the model and conducted an analysis of model interpretability.
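The abstract outlines a two-stage pipeline: per-residue embeddings extracted with the pre-trained ProtT5 encoder, followed by an Attention-CNN classifier. The sketch below illustrates that general idea using the public Rostlab/prot_t5_xl_half_uniref50-enc checkpoint and PyTorch; the layer sizes, attention type, pooling, and sequence-length cap are illustrative assumptions, not the published PreAlgPro configuration.

```python
# Minimal sketch of a ProtT5 + Attention-CNN pipeline as described in the abstract.
# All architectural choices below are assumptions for illustration only.
import re
import torch
import torch.nn as nn
from transformers import T5Tokenizer, T5EncoderModel

DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Step 1: pre-trained ProtT5 encoder for per-residue embeddings (1024-dim).
CHECKPOINT = "Rostlab/prot_t5_xl_half_uniref50-enc"
tokenizer = T5Tokenizer.from_pretrained(CHECKPOINT, do_lower_case=False)
encoder = T5EncoderModel.from_pretrained(CHECKPOINT).to(DEVICE).eval()

def embed_sequences(seqs, max_len=512):
    """Return per-residue ProtT5 embeddings (B, L, 1024) and the attention mask."""
    # ProtT5 expects space-separated residues; rare residues U/Z/O/B are mapped to X.
    prepped = [" ".join(re.sub(r"[UZOB]", "X", s[:max_len])) for s in seqs]
    batch = tokenizer(prepped, padding="max_length", max_length=max_len + 1,
                      truncation=True, return_tensors="pt").to(DEVICE)
    with torch.no_grad():
        out = encoder(input_ids=batch.input_ids, attention_mask=batch.attention_mask)
    return out.last_hidden_state.float(), batch.attention_mask

# Step 2: an illustrative Attention-CNN head on top of the frozen embeddings.
class AttentionCNN(nn.Module):
    def __init__(self, emb_dim=1024, n_filters=128, kernel_size=7, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(emb_dim, n_heads, batch_first=True)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=kernel_size // 2)
        self.head = nn.Sequential(nn.ReLU(), nn.AdaptiveMaxPool1d(1),
                                  nn.Flatten(), nn.Linear(n_filters, 2))

    def forward(self, x, mask):
        # Self-attention over residues, then 1D convolution and global max pooling.
        x, _ = self.attn(x, x, x, key_padding_mask=(mask == 0))
        x = self.conv(x.transpose(1, 2))   # (B, n_filters, L)
        return self.head(x)                # (B, 2): allergen vs. non-allergen logits

# Toy usage with dummy sequences and an untrained classifier (shape check only).
emb, mask = embed_sequences(["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "GSHMSLFDFFKNKGSAL"])
logits = AttentionCNN().to(DEVICE)(emb, mask)
print(logits.shape)  # torch.Size([2, 2])
```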
Pages: 11