Grain protein function prediction based on self-attention mechanism and bidirectional LSTM

Cited by: 3

Authors
Liu, Jing [1 ]
Tang, Xinghua [1 ]
Guan, Xiao [2 ]
Affiliations
[1] Shanghai Maritime Univ, Coll Informat Engn, Shanghai, Peoples R China
[2] Univ Shanghai Sci & Technol, Sch Hlth Sci, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
grain; protein function prediction; deep learning; self-attention; bidirectional long short-term memory; chemical property; CLASSIFICATION; SEQUENCE; ONTOLOGY;
DOI
10.1093/bib/bbac493
Chinese Library Classification
Q5 [Biochemistry];
Discipline Classification Code
071010; 081704;
Abstract
With the development of genome sequencing technology, using computational methods to predict grain protein function has become an important task in bioinformatics. The experimental dataset comprises protein data from four grains: soybean, maize, indica and japonica. In this paper, a novel neural network algorithm, Chemical-SA-BiLSTM, is proposed for grain protein function prediction. The Chemical-SA-BiLSTM algorithm fuses the chemical properties of proteins with their amino acid sequence representations, and combines the self-attention mechanism with a bidirectional Long Short-Term Memory network. The experimental results show that the Chemical-SA-BiLSTM algorithm outperforms other classical neural network algorithms and predicts protein function more accurately, which demonstrates its effectiveness for grain protein function prediction. The source code of our method is available at https://github.com/HwaTong/Chemical-SA-BiLSTM.
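To make the architecture described in the abstract concrete, the following PyTorch sketch shows one plausible way to combine a bidirectional LSTM, a self-attention layer and sequence-level chemical-property features for multi-label protein function prediction. This is a minimal illustration, not the authors' implementation (see the GitHub link above for the official code): the class name ChemicalSABiLSTM, all layer sizes, the mean-pooling step, the late-fusion point for the chemical features and the sigmoid multi-label output are assumptions.

# Minimal sketch, not the authors' code; dimensions and fusion strategy are assumed.
import torch
import torch.nn as nn

class ChemicalSABiLSTM(nn.Module):
    def __init__(self, vocab_size=26, embed_dim=64, chem_dim=10,
                 hidden_dim=128, num_heads=4, num_labels=500):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Self-attention over the BiLSTM outputs
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        # Fuse the pooled sequence representation with chemical-property features
        self.classifier = nn.Linear(2 * hidden_dim + chem_dim, num_labels)

    def forward(self, seq_ids, chem_feats):
        x = self.embed(seq_ids)                         # (B, L, E) residue embeddings
        h, _ = self.bilstm(x)                           # (B, L, 2H) bidirectional context
        a, _ = self.attn(h, h, h)                       # (B, L, 2H) self-attention
        pooled = a.mean(dim=1)                          # (B, 2H) sequence summary
        fused = torch.cat([pooled, chem_feats], dim=1)  # append chemical descriptors
        return torch.sigmoid(self.classifier(fused))    # multi-label function scores

# Example: two integer-encoded sequences of length 50 with 10 chemical descriptors each
model = ChemicalSABiLSTM()
scores = model(torch.randint(1, 26, (2, 50)), torch.rand(2, 10))

Mean-pooling the attention outputs and concatenating the chemical descriptors just before the classifier is only one reasonable fusion choice; the published method may fuse these signals differently.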
Pages: 11
Related papers
50 in total
  • [41] Underwater image imbalance attenuation compensation based on attention and self-attention mechanism
    Wang, Danxu
    Wei, Yanhui
    Liu, Junnan
    Ouyang, Wenjia
    Zhou, Xilin
    2022 OCEANS HAMPTON ROADS, 2022,
  • [42] Long-Tailed Recognition Based on Self-attention Mechanism
    Feng, Zekai
    Jia, Hong
    Li, Mengke
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT II, ICIC 2024, 2024, 14876 : 380 - 391
  • [43] Deep Learning-Based Identification of Maize Leaf Diseases Is Improved by an Attention Mechanism: Self-Attention
    Qian, Xiufeng
    Zhang, Chengqi
    Chen, Li
    Li, Ke
    FRONTIERS IN PLANT SCIENCE, 2022, 13
  • [44] Separable Self-Attention Mechanism for Point Cloud Local and Global Feature Modeling
    Wang, Fan
    Wang, Xiaoli
    Lv, Dan
    Zhou, Lumei
    Shi, Gang
    IEEE ACCESS, 2022, 10 : 129823 - 129831
  • [45] Combining bidirectional long short-term memory and self-attention mechanism for code search
    Cao, Ben
    Liu, Jianxun
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (10)
  • [46] Protein–protein interaction site prediction by model ensembling with hybrid feature and self-attention
    Cong, Hanhan
    Liu, Hong
    Cao, Yi
    Liang, Cheng
    Chen, Yuehui
    BMC BIOINFORMATICS, 24
  • [47] EEG-based sleep staging via self-attention based capsule network with Bi-LSTM model
    Chen, Jin
    Han, Zhihui
    Qiao, Heyuan
    Li, Chang
    Peng, Hu
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 86
  • [48] SDNN-PPI: self-attention with deep neural network effect on protein-protein interaction prediction
    Li, Xue
    Han, Peifu
    Wang, Gan
    Chen, Wenqi
    Wang, Shuang
    Song, Tao
    BMC GENOMICS, 2022, 23 (01)
  • [50] Compact Cloud Detection with Bidirectional Self-Attention Knowledge Distillation
    Chai, Yajie
    Fu, Kun
    Sun, Xian
    Diao, Wenhui
    Yan, Zhiyuan
    Feng, Yingchao
    Wang, Lei
    REMOTE SENSING, 2020, 12 (17)