Grain protein function prediction based on self-attention mechanism and bidirectional LSTM

Cited by: 3
Authors
Liu, Jing [1 ]
Tang, Xinghua [1 ]
Guan, Xiao [2 ]
Affiliations
[1] Shanghai Maritime Univ, Coll Informat Engn, Shanghai, Peoples R China
[2] Univ Shanghai Sci & Technol, Sch Hlth Sci, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
grain; protein function prediction; deep learning; self-attention; bidirectional long short-term memory; chemical property; CLASSIFICATION; SEQUENCE; ONTOLOGY;
DOI
10.1093/bib/bbac493
Chinese Library Classification
Q5 [Biochemistry];
Discipline code
071010; 081704;
Abstract
With the development of genome sequencing technology, predicting grain protein function computationally has become an important task in bioinformatics. The experimental dataset comprises protein data from four grains: soybean, maize, indica rice and japonica rice. In this paper, a novel neural network algorithm, Chemical-SA-BiLSTM, is proposed for grain protein function prediction. The Chemical-SA-BiLSTM algorithm fuses the chemical properties of proteins with their amino acid sequences, and combines the self-attention mechanism with a bidirectional Long Short-Term Memory network. The experimental results show that Chemical-SA-BiLSTM outperforms other classical neural network algorithms and predicts protein function more accurately, demonstrating its effectiveness for grain protein function prediction. The source code of our method is available at https://github.com/HwaTong/Chemical-SA-BiLSTM.
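The abstract describes an architecture that fuses per-residue chemical-property features with amino acid embeddings, encodes the sequence with a bidirectional LSTM, and applies self-attention before classification. The following is a minimal PyTorch sketch of that general design; all dimensions, the number of chemical-property features, and the mean-pooling step are illustrative assumptions, not the authors' exact implementation (see their repository for the real code).

```python
import torch
import torch.nn as nn

class ChemicalSABiLSTM(nn.Module):
    """Sketch of a Chemical-SA-BiLSTM-style model: amino-acid embeddings
    are concatenated with per-residue chemical-property features, encoded
    by a bidirectional LSTM, re-weighted by self-attention, and pooled
    for multi-label protein function prediction."""

    def __init__(self, vocab_size=26, embed_dim=32, chem_dim=7,
                 hidden_dim=64, num_labels=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # BiLSTM over the fused (embedding + chemical feature) sequence
        self.bilstm = nn.LSTM(embed_dim + chem_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Single-head self-attention over the BiLSTM outputs
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads=1,
                                          batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, seq_ids, chem_feats):
        # seq_ids:    (batch, length) amino-acid indices
        # chem_feats: (batch, length, chem_dim) chemical-property features
        x = torch.cat([self.embed(seq_ids), chem_feats], dim=-1)
        h, _ = self.bilstm(x)              # (batch, length, 2*hidden_dim)
        a, _ = self.attn(h, h, h)          # self-attention over residues
        pooled = a.mean(dim=1)             # average-pool the sequence
        # Sigmoid scores: one independent probability per function label
        return torch.sigmoid(self.classifier(pooled))

# Toy forward pass: a batch of 2 sequences, 50 residues each
model = ChemicalSABiLSTM()
ids = torch.randint(1, 26, (2, 50))
chem = torch.rand(2, 50, 7)
out = model(ids, chem)
print(out.shape)  # torch.Size([2, 10])
```

Sigmoid outputs (rather than softmax) reflect that protein function prediction is typically multi-label: one protein may carry several Gene Ontology annotations at once.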
Pages: 11
Related Articles
50 records in total
  • [21] Method of Voltage Sag Causes Based on Bidirectional LSTM and Attention Mechanism
    Hong Wang
    Linhai Qi
    Yongshuo Ma
    Jiehui Jia
    Zhicong Zheng
    Journal of Electrical Engineering & Technology, 2020, 15 : 1115 - 1125
  • [22] Pedestrian Trajectory Prediction Model Based on Self-Attention Mechanism and Group Behavior Characteristics
    Zhou Y.
    Wu H.
    Cheng H.
    Zheng J.
    Li X.
    Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University, 2020, 45 (12): : 1989 - 1996
  • [23] A Deep Learning Method Based Self-Attention and Bi-directional LSTM in Emotion Classification
    Fei, Rong
    Zhu, Yuanbo
    Yao, Quanzhu
    Xu, Qingzheng
    Hu, Bo
    JOURNAL OF INTERNET TECHNOLOGY, 2020, 21 (05): : 1447 - 1461
  • [24] Sparse Self-Attention LSTM for Sentiment Lexicon Construction
    Deng, Dong
    Jing, Liping
    Yu, Jian
    Sun, Shaolong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2019, 27 (11) : 1777 - 1790
  • [25] Hashtag Recommendation Using LSTM Networks with Self-Attention
    Shen, Yatian
    Li, Yan
    Sun, Jun
    Ding, Wenke
    Shi, Xianjin
    Zhang, Lei
    Shen, Xiajiong
    He, Jing
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (03): : 1261 - 1269
  • [26] Binary Function Similarity Detection Based on Graph Neural Network with Self-Attention Mechanism
    Wu, Dingjie
    He, Xuanzhang
    Zhang, Yao
    Zhu, Junjie
    Zhang, Xinyuan
    Ye, Minchao
    Gao, Zhigang
    2022 IEEE INTL CONF ON DEPENDABLE, AUTONOMIC AND SECURE COMPUTING, INTL CONF ON PERVASIVE INTELLIGENCE AND COMPUTING, INTL CONF ON CLOUD AND BIG DATA COMPUTING, INTL CONF ON CYBER SCIENCE AND TECHNOLOGY CONGRESS (DASC/PICOM/CBDCOM/CYBERSCITECH), 2022, : 971 - 975
  • [27] Improved multistep ahead photovoltaic power prediction model based on LSTM and self-attention with weather forecast data
    Hu, Zehuan
    Gao, Yuan
    Ji, Siyu
    Mae, Masayuki
    Imaizumi, Taiji
    APPLIED ENERGY, 2024, 359
  • [28] Pest Identification Based on Fusion of Self-Attention With ResNet
    Hassan, Sk Mahmudul
    Maji, Arnab Kumar
    IEEE ACCESS, 2024, 12 : 6036 - 6050
  • [29] Fusion attention mechanism bidirectional LSTM for short-term traffic flow prediction
    Li, Zhihong
    Xu, Han
    Gao, Xiuli
    Wang, Zinan
    Xu, Wangtu
    JOURNAL OF INTELLIGENT TRANSPORTATION SYSTEMS, 2024, 28 (04) : 511 - 524
  • [30] GCN-Based LSTM Autoencoder with Self-Attention for Bearing Fault Diagnosis
    Lee, Daehee
    Choo, Hyunseung
    Jeong, Jongpil
    SENSORS, 2024, 24 (15)