PMMS: Predicting essential miRNAs based on multi-head self-attention mechanism and sequences

Citations: 2
Authors
Yan, Cheng [1 ,2 ]
Ding, Changsong [1 ]
Duan, Guihua [3 ]
Affiliations
[1] Hunan Univ Chinese Med, Sch Informat, Changsha, Peoples R China
[2] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun, Peoples R China
[3] Cent South Univ, Sch Comp Sci & Engn, Changsha, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
microRNA; essential miRNA; bi-directional long short-term memory; multi-head self-attention mechanism; deep learning; MICRORNAS; DATABASE; TARGET;
DOI
10.3389/fmed.2022.1015278
CLC Classification
R5 [Internal Medicine];
Discipline Codes
1002; 100201;
Abstract
Increasing evidence has shown that miRNAs play a significant role in biological processes. To understand the etiology and mechanisms of various diseases, it is necessary to identify the essential miRNAs. However, identifying essential miRNAs through traditional biological experiments is time-consuming and expensive, so it is critical to develop computational methods to predict potential essential miRNAs. In this study, we propose a new computational method (PMMS) that identifies essential miRNAs using a multi-head self-attention mechanism and sequences. First, PMMS computes statistical and structural features and concatenates them into a static feature. Second, PMMS extracts an initial deep-learning feature (the BiLSTM-based feature) by applying a bi-directional long short-term memory (BiLSTM) network to pre-miRNA sequences. It then applies a multi-head self-attention mechanism to the BiLSTM-based feature to obtain the multi-head self-attention feature (the MS-based feature). Considering the importance of each pre-miRNA subsequence to the static feature of the miRNA, a weighted attention mechanism then produces the final deep-learning feature (the WA-based feature). Finally, the WA-based feature is concatenated with the static feature and fed into a multilayer perceptron (MLP) to predict essential miRNAs. We conducted five-fold cross-validation to evaluate the prediction performance of PMMS, using the area under the ROC curve (AUC), F1-score, and accuracy (ACC) as performance metrics. In the experiments, PMMS obtained the best prediction performance (AUC: 0.9556, F1-score: 0.9030, ACC: 0.9097) and outperformed the compared methods, illustrating that PMMS is an effective method for identifying essential miRNAs.
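The pipeline described in the abstract maps naturally onto standard deep-learning building blocks. Below is a minimal PyTorch sketch of that architecture, not the authors' implementation: all layer sizes, the four-letter sequence tokenization, the attention-pooling formulation, and the MLP width are illustrative assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

class PMMS(nn.Module):
    """Sketch of the PMMS pipeline from the abstract (assumed sizes)."""

    def __init__(self, vocab_size=5, embed_dim=64, hidden_dim=64,
                 num_heads=4, static_dim=32):
        super().__init__()
        # Embed pre-miRNA sequence tokens (A, C, G, U; index 0 = padding).
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # BiLSTM-based feature: one contextual vector per sequence position.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # MS-based feature: multi-head self-attention over BiLSTM outputs.
        self.self_attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                               batch_first=True)
        # Weighted attention: score each subsequence position, then pool
        # the sequence into a single WA-based feature vector.
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        # MLP over the concatenated [WA-based feature ; static feature].
        self.mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim + static_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, seq_tokens, static_feat):
        x = self.embed(seq_tokens)                    # (B, L, E)
        h, _ = self.bilstm(x)                         # (B, L, 2H)
        a, _ = self.self_attn(h, h, h)                # (B, L, 2H)
        w = torch.softmax(self.attn_score(a), dim=1)  # (B, L, 1)
        wa_feat = (w * a).sum(dim=1)                  # (B, 2H)
        fused = torch.cat([wa_feat, static_feat], dim=-1)
        return self.mlp(fused).squeeze(-1)            # essentiality logit

# Shape check on random inputs.
model = PMMS()
tokens = torch.randint(1, 5, (8, 100))  # 8 pre-miRNA sequences, length 100
static = torch.randn(8, 32)             # matching static feature vectors
logits = model(tokens, static)          # -> shape (8,)
```

For training, one would optimize a binary cross-entropy loss on these logits (e.g., torch.nn.BCEWithLogitsLoss) and report AUC, F1-score, and ACC under five-fold cross-validation, for instance via sklearn.model_selection.StratifiedKFold and sklearn.metrics.roc_auc_score.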
Pages: 11