Characterizing Secretion System Effector Proteins With Structure-Aware Graph Neural Networks and Pre-Trained Language Models

Cited by: 1
Authors
Ran, Zixu [1 ]
Wang, Cong [1 ]
Sun, Heyun [2 ]
Pan, Shirui [3 ]
Li, Fuyi [1 ]
Affiliations
[1] Northwest A&F Univ, Coll Informat Engn, Xianyang 712100, Peoples R China
[2] Univ Adelaide, South Australian immunoGEN Canc Inst SAiGENCI, Adelaide, SA 5000, Australia
[3] Griffith Univ, Sch Informat & Commun Technol, Brisbane, Qld 4222, Australia
Funding
National Natural Science Foundation of China
Keywords
Proteins; Feature extraction; Three-dimensional displays; Amino acids; Solvents; Bioinformatics; Benchmark testing; Deep learning; host-pathogen interaction; protein 3D structure; secreted protein; III SECRETION; MIMICRY;
DOI
10.1109/JBHI.2024.3413146
CLC classification number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Type III Secretion Systems (T3SSs) play a pivotal role in host-pathogen interactions by mediating the secretion of type III secretion system effectors (T3SEs) into host cells. These T3SEs mimic host cell protein functions, influencing interactions between Gram-negative bacterial pathogens and their hosts. Identifying T3SEs is essential in biomedical research for understanding bacterial pathogenesis and its implications for human cells. This study presents EDIFIER, a novel multi-channel model designed for accurate T3SE prediction. It incorporates a graph structural channel, which uses graph convolutional networks (GCNs) to capture protein 3D structural features, and a sequence channel based on the ProteinBERT pre-trained model to extract the sequence context features of T3SEs. Rigorous benchmarking tests, including ablation studies and comparative analysis, validate that EDIFIER outperforms current state-of-the-art tools in T3SE prediction. To enhance EDIFIER's accessibility to the broader scientific community, we developed a webserver that is publicly accessible at http://edifier.unimelb-biotools.cloud.edu.au/. We anticipate EDIFIER will contribute to the field by providing reliable T3SE predictions, thereby advancing our understanding of host-pathogen dynamics.
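The abstract's two-channel design (a GCN over the protein's 3D structure, fused with a sequence embedding for classification) can be illustrated with a minimal sketch. This is not the authors' implementation: the residue contact map, feature dimensions, and the `predict_effector` fusion head are illustrative assumptions, and the ProteinBERT sequence embedding is assumed to be precomputed and passed in as a vector.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric GCN normalization: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A_norm, X, W):
    """One graph-convolution step with ReLU activation."""
    return np.maximum(A_norm @ X @ W, 0.0)

def predict_effector(contact_map, residue_feats, seq_embedding, params):
    """Hypothetical two-channel fusion: GCN pooling + sequence vector."""
    # Structural channel: two GCN layers over the residue contact graph,
    # then mean pooling to get one protein-level structural vector.
    A_norm = normalize_adjacency(contact_map)
    h = gcn_layer(A_norm, residue_feats, params["W1"])
    h = gcn_layer(A_norm, h, params["W2"])
    g_struct = h.mean(axis=0)
    # Fuse with the (precomputed) sequence-channel embedding and classify.
    fused = np.concatenate([g_struct, seq_embedding])
    logit = fused @ params["w_out"] + params["b_out"]
    return 1.0 / (1.0 + np.exp(-logit))    # probability of being a T3SE
```

In this sketch the contact map would come from a predicted or experimental 3D structure (residue pairs within a distance threshold), and a trained model would learn `W1`, `W2`, and the output weights end to end rather than using the random placeholders shown here.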
Pages: 5649-5657
Page count: 9