CNO-LSTM: A Chaotic Neural Oscillatory Long Short-Term Memory Model for Text Classification

Cited by: 7
Authors
Shi, Nuobei [1 ]
Chen, Zhuohui [2 ]
Chen, Ling [3 ]
Lee, Raymond S. T. [1 ]
Affiliations
[1] Beijing Normal Univ, Hong Kong Baptist Univ United Int Coll, Fac Sci & Technol, Zhuhai 519000, Peoples R China
[2] Macau Univ Sci & Technol, Fac Innovat Engn, Taipa, Macau, Peoples R China
[3] Beijing Inst Technol Zhuhai, Sch Appl Sci & Civil Engn, Zhuhai 519000, Peoples R China
Keywords
Neurons; long short-term memory; chaotic neural network; Lee oscillator; text classification; natural language processing; autoassociative network; sentiment analysis; social media
DOI
10.1109/ACCESS.2022.3228600
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Long Short-Term Memory (LSTM) networks are distinctive in retaining data in their memory cells as long-term memory, yet Natural Language Processing (NLP) tasks remain time- and compute-intensive because large language models such as the Transformer must be pre-trained on billions of data points to perform different NLP tasks. In this paper, a dynamic chaotic model is proposed to transform neuron states in the network with neural dynamic characteristics by restructuring the LSTM as a Chaotic Neural Oscillatory Long Short-Term Memory (CNO-LSTM), in which the neurons in the LSTM memory cells are replaced by oscillatory neurons to speed up language-model training and to improve text-classification accuracy for real-world applications. For the implementation, five popular text-classification datasets covering binary, multi-class, and multi-label classification are used for comparison against mainstream baseline models on NLP tasks. Results show that CNO-LSTM, with its simplified model structure and oscillatory neuron states, outperforms the baseline models on different types of text-classification tasks in terms of evaluation indices such as Accuracy, Precision, Recall, and F1. The main contributions are reduced training time and improved accuracy: compared with the vanilla LSTM model, CNO-LSTM achieves up to a 46.76% reduction in training time and a 2.55% gain in accuracy. Furthermore, it achieves approximately a 35.86% reduction in training time compared with an attention model without oscillatory neurons, indicating that the restructured model reduces GPU dependency while improving training accuracy.
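The Lee oscillator named in the keywords is, in broad terms, a small coupled excitatory–inhibitory neuron pair whose output oscillates chaotically for weak or ambiguous inputs and settles toward the stimulus for strong ones. The following is a minimal illustrative sketch of such an oscillator in Python; the function name, constants, and the exact update and output-mixing rules are assumptions for illustration, not the paper's parameterisation of CNO-LSTM.

```python
import numpy as np

def sigmoid(x, k=50.0):
    # Steep sigmoid nonlinearity; the gain k is an illustrative choice.
    return 1.0 / (1.0 + np.exp(-k * x))

def lee_oscillator(stimulus, steps=100, a1=5.0, a2=5.0, b1=1.0, b2=1.0,
                   xi_u=0.0, xi_v=0.0, decay=50.0):
    """Simplified excitatory/inhibitory oscillator pair (hypothetical form).

    u: excitatory neuron state, v: inhibitory neuron state,
    z: oscillator output fed back toward the stimulus.
    All constants are illustrative, not the paper's exact values.
    """
    u, v = 0.1, 0.1
    trace = []
    for _ in range(steps):
        # Mutually coupled excitatory and inhibitory updates.
        u_next = sigmoid(a1 * u - a2 * v + stimulus - xi_u)
        v_next = sigmoid(b1 * u - b2 * v - xi_v)
        u, v = u_next, v_next
        # Output: the (u - v) oscillation is damped as the stimulus
        # magnitude grows, so strong inputs yield a near-direct response
        # while weak inputs keep oscillating (chaotic regime).
        z = (u - v) * np.exp(-decay * stimulus ** 2) + stimulus
        trace.append(float(z))
    return trace
```

In a CNO-LSTM-style restructuring, an output of this kind would stand in for the static activation of a memory-cell neuron, so the cell's response near the decision boundary is dynamic rather than fixed.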
Pages: 129564–129579
Page count: 16