Recurrent Neural Network for Gene Regulation Network Construction on Time Series Expression Data

Cited by: 0
Authors
Zhao, Yue [1 ]
Joshi, Pujan [1 ]
Shin, Dong-Guk [1 ]
Affiliations
[1] Univ Connecticut, Comp Sci & Engn Dept, Storrs, CT 06269 USA
Source
2019 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM) | 2019
Keywords
Recurrent Neural Network; Gene Regulation Network; Modeling and Simulation; COMPOUND-MODE; SINGLE;
DOI
10.1109/bibm47256.2019.8983068
Chinese Library Classification
Q5 [Biochemistry];
Discipline codes
071010 ; 081704 ;
Abstract
We propose a new way of exploring potential transcription factor targets in which a Recurrent Neural Network (RNN) is used to model time series gene expression data. Once training of the RNN is complete, inference is performed by feeding the RNN artificially constructed signals. These artificial signals emulate the original gene expression data, except that the transcription factor of interest is held constantly at zero to model its knockout state. The expression patterns that the RNN predicts for the remaining genes are then used to measure the likelihood that each gene is regulated by the knocked-out transcription factor. By repeating this process with each gene in the dataset treated as the transcription factor, we construct a gene regulation network with weighted edges. We demonstrate the effectiveness of our model by comparing our method with existing popular approaches. The results show that our RNN method identifies transcription factor targets with higher accuracy than most existing approaches. Overall, our RNN model trained on time series gene expression data is useful for discovering transcription factor targets as well as for building a gene regulation network.
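The sketch below is not the authors' code; it only illustrates, under assumptions, the in-silico knockout procedure the abstract describes: train an RNN to predict each next time point of the expression matrix, then clamp one transcription factor's input to zero and score every other gene by how much its predicted trajectory shifts. The framework (PyTorch), the one-step-ahead formulation, and all names (ExpressionRNN, train_rnn, knockout_scores, expr) are illustrative choices, not details from the paper.

```python
import torch
import torch.nn as nn


class ExpressionRNN(nn.Module):
    """One-step-ahead predictor over all genes (illustrative architecture)."""

    def __init__(self, n_genes: int, hidden: int = 64):
        super().__init__()
        self.rnn = nn.RNN(input_size=n_genes, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_genes)

    def forward(self, x):                      # x: (batch, time, n_genes)
        h, _ = self.rnn(x)
        return self.out(h)                     # predicted expression at the next time point


def train_rnn(expr: torch.Tensor, epochs: int = 200, lr: float = 1e-3) -> ExpressionRNN:
    """expr: (batch, time, n_genes) time series expression matrix."""
    model = ExpressionRNN(expr.shape[-1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        pred = model(expr[:, :-1, :])          # predict time points 1..T from 0..T-1
        loss = nn.functional.mse_loss(pred, expr[:, 1:, :])
        loss.backward()
        opt.step()
    return model


@torch.no_grad()
def knockout_scores(model: ExpressionRNN, expr: torch.Tensor, tf_idx: int) -> torch.Tensor:
    """Clamp one TF to zero and measure how much every gene's prediction shifts."""
    baseline = model(expr[:, :-1, :])
    knocked = expr.clone()
    knocked[..., tf_idx] = 0.0                 # emulate the knockout state of the TF
    perturbed = model(knocked[:, :-1, :])
    # Larger deviation is taken as a higher likelihood that the gene is a target of this TF.
    return (baseline - perturbed).abs().mean(dim=(0, 1))


# Usage: repeat the knockout over every candidate TF to fill a weighted adjacency matrix.
# expr = torch.tensor(..., dtype=torch.float32)   # shape (1, T, n_genes)
# model = train_rnn(expr)
# weights = torch.stack([knockout_scores(model, expr, i) for i in range(expr.shape[-1])])
```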
Pages: 610-615
Page count: 6