CNO-LSTM: A Chaotic Neural Oscillatory Long Short-Term Memory Model for Text Classification

Cited by: 7
Authors
Shi, Nuobei [1 ]
Chen, Zhuohui [2 ]
Chen, Ling [3 ]
Lee, Raymond S. T. [1 ]
Affiliations
[1] Beijing Normal Univ, Hong Kong Baptist Univ United Int Coll, Fac Sci & Technol, Zhuhai 519000, Peoples R China
[2] Macau Univ Sci & Technol, Fac Innovat Engn, Taipa, Macau, Peoples R China
[3] Beijing Inst Technol Zhuhai, Sch Appl Sci & Civil Engn, Zhuhai 519000, Peoples R China
Keywords
Neurons; long short-term memory; chaotic neural network; Lee-oscillator; text classification; natural language processing; autoassociative network; sentiment analysis; social media
DOI
10.1109/ACCESS.2022.3228600
CLC Number
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Long Short-Term Memory (LSTM) networks are distinctive in retaining information over long periods within their memory cells, yet Natural Language Processing (NLP) tasks remain time- and compute-intensive because complex architectures, such as large Transformer language models, must be pre-trained on billions of samples to perform different NLP tasks. In this paper, a dynamic chaotic model is proposed to endow network neuron states with neural dynamic characteristics by restructuring the LSTM as a Chaotic Neural Oscillatory Long Short-Term Memory (CNO-LSTM), in which the neurons in the LSTM memory cells are replaced by oscillatory neurons to speed up language-model training and improve text classification accuracy for real-world applications. For evaluation, five popular text classification datasets, covering binary, multi-class, and multi-label classification, are used to compare CNO-LSTM with mainstream baseline models on NLP tasks. Results show that CNO-LSTM, with its simplified model structure and oscillatory neuron states, outperforms the baseline models across these tasks on evaluation metrics such as Accuracy, Precision, Recall, and F1. The main contributions are reduced training time and improved accuracy: CNO-LSTM achieves up to approximately 46.76% reduction in training time and a 2.55% accuracy gain over the vanilla LSTM model, and approximately 35.86% time reduction compared with an attention model without oscillatory neurons, indicating that the restructured model reduces GPU dependency while improving training accuracy.
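The record contains only the abstract, so the paper's exact Lee-oscillator dynamics are not reproduced here. As a rough, hypothetical sketch of the core idea described above, swapping the sigmoidal cell nonlinearity of a vanilla LSTM for a chaotic oscillatory response, the following uses a logistic map as a stand-in for the authors' oscillator; all names and parameters (`chaotic_activation`, `lstm_step`, `r`, `iters`) are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def chaotic_activation(x, r=3.9, iters=20):
    """Hypothetical chaotic stand-in for tanh: seed a logistic map with the
    squashed input, iterate it in its chaotic regime, and map the result
    back to tanh's output range [-1, 1]. Illustration only, not the
    paper's Lee-oscillator formulation."""
    z = 0.5 * (np.tanh(x) + 1.0)       # squash input into (0, 1) as the map seed
    z = np.clip(z, 1e-6, 1.0 - 1e-6)   # keep the seed inside the map's domain
    for _ in range(iters):
        z = r * z * (1.0 - z)          # logistic-map iterations (chaotic at r ≈ 3.9)
    return 2.0 * z - 1.0               # rescale to [-1, 1]

def lstm_step(x, h, c, W, U, b, cell_act=np.tanh):
    """One vanilla LSTM step; `cell_act` lets the cell/output nonlinearity
    be swapped for an oscillatory one, mirroring the neuron substitution
    the abstract describes."""
    n = h.shape[0]
    gates = W @ x + U @ h + b          # stacked i, f, o, g pre-activations
    i, f, o = (1.0 / (1.0 + np.exp(-gates[k * n:(k + 1) * n])) for k in range(3))
    g = cell_act(gates[3 * n:4 * n])   # candidate cell update
    c_new = f * c + i * g              # forget old state, admit new input
    h_new = o * cell_act(c_new)        # expose gated, activated cell state
    return h_new, c_new
```

Because the substitution is confined to the cell nonlinearity, the same `lstm_step` runs with either activation, which is one plausible way to compare a vanilla LSTM against an oscillatory variant on identical weights.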
Pages: 129564 - 129579
Page count: 16
Related Papers
50 records in total
  • [1] Text Classification Using Long Short-Term Memory
    Sari, Winda Kurnia
    Rini, Dian Palupi
    Malik, Reza Firsandaya
    2019 3RD INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING AND COMPUTER SCIENCE (ICECOS 2019), 2019, : 150 - 155
  • [2] A text classification method based on a convolutional and bidirectional long short-term memory model
    Huan, Hai
    Guo, Zelin
    Cai, Tingting
    He, Zichen
    CONNECTION SCIENCE, 2022, 34 (01) : 2108 - 2124
  • [3] A Hybrid Model Based on Convolutional Neural Network and Long Short-Term Memory for Multi-label Text Classification
    Maragheh, Hamed Khataei
    Gharehchopogh, Farhad Soleimanian
    Majidzadeh, Kambiz
    Sangar, Amin Babazadeh
    NEURAL PROCESSING LETTERS, 2024, 56 (02)
  • [4] A review on the long short-term memory model
    Van Houdt, Greg
    Mosquera, Carlos
    Napoles, Gonzalo
    ARTIFICIAL INTELLIGENCE REVIEW, 2020, 53 (08) : 5929 - 5955
  • [5] Well performance prediction based on Long Short-Term Memory (LSTM) neural network
    Huang, Ruijie
    Wei, Chenji
    Wang, Baohua
    Yang, Jian
    Xu, Xin
    Wu, Suwei
    Huang, Suqi
    JOURNAL OF PETROLEUM SCIENCE AND ENGINEERING, 2022, 208
  • [6] LSTM-SNP: A long short-term memory model inspired from spiking neural P systems
    Liu, Qian
    Long, Lifan
    Yang, Qian
    Peng, Hong
    Wang, Jun
    Luo, Xiaohui
    KNOWLEDGE-BASED SYSTEMS, 2022, 235
  • [7] Feature selection based on long short term memory for text classification
    Hong, Ming
    Wang, Heyong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (15) : 44333 - 44378
  • [8] Comparing Long Short-Term Memory (LSTM) and bidirectional LSTM deep neural networks for power consumption prediction
    da Silva, Davi Guimaraes
    Meneses, Anderson Alvarenga de Moura
    ENERGY REPORTS, 2023, 10 : 3315 - 3334