CNO-LSTM: A Chaotic Neural Oscillatory Long Short-Term Memory Model for Text Classification

Cited by: 7
Authors
Shi, Nuobei [1 ]
Chen, Zhuohui [2 ]
Chen, Ling [3 ]
Lee, Raymond S. T. [1 ]
Affiliations
[1] Beijing Normal Univ, Hong Kong Baptist Univ United Int Coll, Fac Sci & Technol, Zhuhai 519000, Peoples R China
[2] Macau Univ Sci & Technol, Fac Innovat Engn, Taipa, Macau, Peoples R China
[3] Beijing Inst Technol Zhuhai, Sch Appl Sci & Civil Engn, Zhuhai 519000, Peoples R China
Keywords
Neurons; long short-term memory; chaotic neural network; Lee-oscillator; text classification; natural language processing; autoassociative network; sentiment analysis; social media
DOI
10.1109/ACCESS.2022.3228600
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Long Short-Term Memory (LSTM) networks retain information over long spans through their memory cells, yet Natural Language Processing (NLP) tasks remain time- and compute-intensive because large language models such as the Transformer must be pre-trained on billions of data samples to perform different NLP tasks. In this paper, a dynamic chaotic model is proposed to give network neuron states neural dynamic characteristics by restructuring the LSTM into a Chaotic Neural Oscillatory Long Short-Term Memory (CNO-LSTM), in which the neurons in the LSTM memory cells are replaced by oscillatory neurons to speed up language-model training and improve text classification accuracy for real-world applications. For evaluation, five popular text classification datasets covering binary, multi-class, and multi-label classification are used to compare CNO-LSTM with mainstream baseline models on NLP tasks. The results show that CNO-LSTM, with its simplified model structure and oscillatory neuron states, outperforms the baseline models on the different types of text classification tasks in terms of Accuracy, Precision, Recall, and F1. The main contributions are reduced training time and improved accuracy: CNO-LSTM achieves up to approximately 46.76% reduction in training time and a 2.55% gain in accuracy compared with the vanilla LSTM model, and approximately 35.86% reduction in training time compared with an attention model without oscillatory neurons, indicating that the restructured model reduces GPU dependency while improving training accuracy.
Pages: 129564-129579
Number of pages: 16
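
The abstract above describes the architectural change only at a high level, so the following is a minimal, hypothetical PyTorch sketch of that idea: an LSTM cell whose tanh non-linearities are replaced by a chaotic, Lee-oscillator-style activation. The oscillator equations and constants, the name CNOLSTMCell, and the choice of which activations to replace are illustrative assumptions, not the authors' published formulation.

# Minimal, hypothetical sketch of the idea described in the abstract:
# an LSTM cell whose tanh non-linearities are swapped for a chaotic,
# Lee-oscillator-style activation. The oscillator dynamics, constants and
# class name CNOLSTMCell are assumptions made for illustration only.
import torch
import torch.nn as nn

def lee_oscillator_activation(x, steps=50, a1=5.0, a2=5.0, b1=5.0, b2=5.0, k=50.0):
    # Iterate simplified excitatory (u) / inhibitory (v) oscillator dynamics
    # element-wise, then blend the oscillatory transient with a tanh envelope.
    u = torch.zeros_like(x)
    v = torch.zeros_like(x)
    for _ in range(steps):
        u, v = torch.tanh(a1 * u - a2 * v + x), torch.tanh(b1 * u - b2 * v)
    w = torch.tanh(x)
    return (u - v) * torch.exp(-k * x * x) + w

class CNOLSTMCell(nn.Module):
    # LSTM cell in which the candidate and cell-state activations use the
    # oscillatory non-linearity instead of tanh; the gates remain sigmoidal.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.linear = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def forward(self, x, state):
        h, c = state
        gates = self.linear(torch.cat([x, h], dim=-1))
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = lee_oscillator_activation(g)              # replaces tanh on the candidate
        c_new = f * c + i * g
        h_new = o * lee_oscillator_activation(c_new)  # replaces tanh on the cell state
        return h_new, c_new

A cell like this could be unrolled over an embedded token sequence in place of torch.nn.LSTMCell; the reported speed-up and accuracy figures depend on the authors' actual oscillator design and training setup, which this sketch does not reproduce.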