Grain protein function prediction based on self-attention mechanism and bidirectional LSTM

Cited: 3
Authors
Liu, Jing [1 ]
Tang, Xinghua [1 ]
Guan, Xiao [2 ]
Affiliations
[1] Shanghai Maritime Univ, Coll Informat Engn, Shanghai, Peoples R China
[2] Univ Shanghai Sci & Technol, Sch Hlth Sci, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
grain; protein function prediction; deep learning; self-attention; bidirectional long short-term memory; chemical property; CLASSIFICATION; SEQUENCE; ONTOLOGY;
DOI
10.1093/bib/bbac493
Chinese Library Classification
Q5 [Biochemistry];
Discipline classification codes
071010; 081704;
Abstract
With the development of genome sequencing technology, using computational methods to predict grain protein function has become an important task in bioinformatics. The experimental dataset comprises protein data from four grains: soybean, maize, indica rice and japonica rice. This paper proposes a novel neural network algorithm, Chemical-SA-BiLSTM, for grain protein function prediction. Chemical-SA-BiLSTM fuses the chemical properties of proteins with their amino acid sequences, and combines a self-attention mechanism with a bidirectional Long Short-Term Memory network. Experimental results show that Chemical-SA-BiLSTM outperforms other classical neural network algorithms and predicts protein function more accurately, demonstrating its effectiveness for grain protein function prediction. The source code of the method is available at https://github.com/HwaTong/Chemical-SA-BiLSTM.
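The two fusion steps the abstract describes can be sketched in a minimal, illustrative way. This is NOT the authors' Chemical-SA-BiLSTM implementation (see their GitHub repository for that): it only shows (1) augmenting one-hot amino acid encodings with a chemical-property channel and (2) applying scaled dot-product self-attention so each residue attends over the whole sequence. The property table and the identity query/key/value projections are placeholder assumptions, and the real model would also include a BiLSTM layer and a function-prediction head.

```python
import numpy as np

# Illustrative sketch only -- not the authors' code.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
# Placeholder per-residue chemical property (e.g. a hydrophobicity-like scale).
CHEM_PROP = {aa: i / (len(AMINO_ACIDS) - 1) for i, aa in enumerate(AMINO_ACIDS)}

def encode(seq):
    """One-hot encode residues, then append one chemical-property column."""
    onehot = np.zeros((len(seq), len(AMINO_ACIDS)))
    for i, aa in enumerate(seq):
        onehot[i, AMINO_ACIDS.index(aa)] = 1.0
    chem = np.array([[CHEM_PROP[aa]] for aa in seq])
    return np.concatenate([onehot, chem], axis=1)          # shape (L, 21)

def self_attention(x):
    """Scaled dot-product self-attention with identity Q/K/V projections."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)                          # (L, L) pairwise scores
    scores -= scores.max(axis=1, keepdims=True)            # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)          # row-wise softmax
    return weights @ x                                     # context-mixed (L, 21)

features = encode("MKTAYIAK")        # toy 8-residue sequence
attended = self_attention(features)
print(attended.shape)                # (8, 21)
```

In the full architecture described by the abstract, the attended features would feed a bidirectional LSTM (or vice versa) before the final protein-function classifier.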
Pages: 11
Related Papers
50 records
  • [31] Multitasks Joint Prediction of Fuel Cells Based on Self-Attention Residual Network
    Wu, Yufeng
    Yuan, Hao
    Liu, Zhaoming
    Wei, Xuezhe
    Dai, Haifeng
    IEEE TRANSACTIONS ON TRANSPORTATION ELECTRIFICATION, 2024, 10 (03): : 6867 - 6879
  • [32] Prediction and scheduling of multi-energy microgrid based on BiGRU self-attention mechanism and LQPSO
    Duan, Yuchen
    Li, Peng
    Xia, Jing
    GLOBAL ENERGY INTERCONNECTION-CHINA, 2024, 7 (03): : 347 - 361
  • [33] Self-Attention based encoder-Decoder for multistep human density prediction
    Violos, John
    Theodoropoulos, Theodoros
    Maroudis, Angelos-Christos
    Leivadeas, Aris
    Tserpes, Konstantinos
    JOURNAL OF URBAN MOBILITY, 2022, 2
  • [34] Leveraging Self-Attention Mechanism for Attitude Estimation in Smartphones
    Brotchie, James
    Shao, Wei
    Li, Wenchao
    Kealy, Allison
    SENSORS, 2022, 22 (22)
  • [35] Pyramid self-attention mechanism-based change detection in hyperspectral imagery
    Wang, Guanghui
    Peng, Yaoyao
    Zhang, Shubi
    Wang, Geng
    Zhang, Tao
    Qi, Jianwei
    Zheng, Shulei
    Liu, Yu
    JOURNAL OF APPLIED REMOTE SENSING, 2021, 15 (04)
  • [36] Spatiotemporal module for video saliency prediction based on self-attention
    Wang, Yuhao
    Liu, Zhuoran
    Xia, Yibo
    Zhu, Chunbo
    Zhao, Danpei
    IMAGE AND VISION COMPUTING, 2021, 112
  • [37] Sparse Coding Inspired LSTM and Self-Attention Integration for Medical Image Segmentation
    Ji, Zexuan
    Ye, Shunlong
    Ma, Xiao
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 6098 - 6113
  • [38] Microblog Sentiment Classification Method Based on Dual Attention Mechanism and Bidirectional LSTM
    Wei, Wenjie
    Zhang, Yangsen
    Duan, Ruixue
    Zhang, Wen
    CHINESE LEXICAL SEMANTICS (CLSW 2019), 2020, 11831 : 309 - 320
  • [39] A self-attention model for viewport prediction based on distance constraint
    Lan, ChengDong
    Qiu, Xu
    Miao, Chenqi
    Zheng, MengTing
    VISUAL COMPUTER, 2024, 40 (09) : 5997 - 6014
  • [40] Pedestrian Attribute Recognition Based on Dual Self-attention Mechanism
    Fan, Zhongkui
    Guan, Ye-peng
    COMPUTER SCIENCE AND INFORMATION SYSTEMS, 2023, 20 (02) : 793 - 812