Research on News Text Classification Based on BERT-BiLSTM-TextCNN-Attention

Times Cited: 0
Authors
Wang, Jia [1 ]
Li, Zongting [2 ]
Ma, Chenyang [2 ]
Affiliations
[1] Dalian Polytech Univ, Dalian 116034, Liaoning, Peoples R China
[2] Dalian Polytech Univ, Sch Informat Sci & Engn, Dalian 116034, Liaoning, Peoples R China
Source
PROCEEDINGS OF 2024 3RD INTERNATIONAL CONFERENCE ON CYBER SECURITY, ARTIFICIAL INTELLIGENCE AND DIGITAL ECONOMY, CSAIDE 2024, 2024
Keywords
Deep learning; text classification; natural language processing; neural network
DOI
10.1145/3672919.3672973
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Traditional machine learning models struggle to capture complex features and contextual relationships, and although a single deep learning architecture outperforms traditional machine learning on text processing, it cannot cover the full range of textual information [5]. This paper therefore proposes a news text classifier built on BERT-BiLSTM-TextCNN-Attention. The model first uses the pre-trained BERT language model to encode the text. The resulting representations are fed into a BiLSTM layer, which captures sequence information and long-term dependencies for a comprehensive semantic view. The output then passes through a TextCNN layer, whose convolutions extract local semantic features. Finally, an attention mechanism highlights the most salient features, refining the feature vector before the Softmax layer performs classification. Experiments on a subset of the THUCNews Chinese news text dataset show that the BERT-BiLSTM-TextCNN-Attention model achieves 96.48% accuracy, outperforming the benchmark models. This demonstrates its advantage in Chinese news text classification and validates its ability to extract deep semantic information and important local features from the text.
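The pipeline described in the abstract (BERT encoding, then BiLSTM, then TextCNN, then attention, then Softmax) can be sketched as below. This is a minimal illustration in PyTorch with Hugging Face transformers, not the authors' implementation: the checkpoint bert-base-chinese, the class name BertBiLstmTextCnnAttention, and all layer sizes, kernel widths, and the additive-attention formulation are assumptions, and padded positions are not masked in the attention for brevity.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertBiLstmTextCnnAttention(nn.Module):
    def __init__(self, num_classes=10, lstm_hidden=128,
                 cnn_channels=128, kernel_sizes=(3, 5, 7)):
        super().__init__()
        # Pre-trained BERT encoder produces contextual token embeddings.
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # BiLSTM over the token embeddings captures long-range dependencies.
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Parallel 1-D convolutions (TextCNN) extract local n-gram features;
        # odd kernel sizes with "same" padding keep the sequence length fixed
        # so the feature maps can be concatenated channel-wise.
        self.convs = nn.ModuleList([
            nn.Conv1d(2 * lstm_hidden, cnn_channels, k, padding=k // 2)
            for k in kernel_sizes
        ])
        feat_dim = len(kernel_sizes) * cnn_channels
        # Additive attention scores each position before pooling.
        self.attn = nn.Linear(feat_dim, 1)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, hidden) contextual embeddings from BERT.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        seq, _ = self.bilstm(hidden)                  # (batch, seq_len, 2*lstm_hidden)
        conv_in = seq.transpose(1, 2)                 # Conv1d expects (batch, C, L)
        feats = torch.cat([torch.relu(conv(conv_in)) for conv in self.convs], dim=1)
        feats = feats.transpose(1, 2)                 # (batch, seq_len, feat_dim)
        # Attention weights emphasise the most informative positions.
        weights = torch.softmax(self.attn(feats).squeeze(-1), dim=1)
        pooled = torch.bmm(weights.unsqueeze(1), feats).squeeze(1)
        return self.classifier(pooled)                # logits for the final classifier


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    model = BertBiLstmTextCnnAttention(num_classes=10)
    batch = tokenizer(["体育新闻示例", "财经新闻示例"], padding=True,
                      truncation=True, max_length=32, return_tensors="pt")
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([2, 10])
```

In a training loop, nn.CrossEntropyLoss would be applied to the returned logits, which folds in the Softmax step mentioned in the abstract.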
Pages: 295-298 (4 pages)
Related Papers
50 records (showing 31-40)
  • [31] Fake news detection and classification using hybrid BiLSTM and self-attention model
    Mohapatra, Asutosh
    Thota, Nithin
    Prakasam, P.
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (13) : 18503 - 18519
  • [33] Improving BERT-Based Text Classification With Auxiliary Sentence and Domain Knowledge
    Yu, Shanshan
    Su, Jindian
    Luo, Da
    IEEE ACCESS, 2019, 7 : 176600 - 176612
  • [34] A Neural Network Based Text Classification with Attention Mechanism
    Lu SiChen
    PROCEEDINGS OF 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT 2019), 2019, : 333 - 338
  • [35] The Study on the Text Classification Based on Graph Convolutional Network and BiLSTM
    Xue, Bingxin
    Zhu, Cui
    Wang, Xuan
    Zhu, Wenjun
    APPLIED SCIENCES-BASEL, 2022, 12 (16):
  • [36] Text Classification Research with Attention-based Recurrent Neural Networks
    Du, C.
    Huang, L.
    INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL, 2018, 13 (01) : 50 - 61
  • [37] Chinese Text Classification Method Based on BERT Word Embedding
    Wang, Ziniu
    Huang, Zhilin
    Gao, Jianling
    2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020, : 66 - 71
  • [38] The Automatic Text Classification Method Based on BERT and Feature Union
    Li, Wenting
    Gao, Shangbing
    Zhou, Hong
    Huang, Zihe
    Zhang, Kewen
    Li, Wei
    2019 IEEE 25TH INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS (ICPADS), 2019, : 774 - 777
  • [39] Cross-Domain Text Classification Based on BERT Model
    Zhang, Kuan
    Hei, Xinhong
    Fei, Rong
    Guo, Yufan
    Jiao, Rui
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS: DASFAA 2021 INTERNATIONAL WORKSHOPS, 2021, 12680 : 197 - 208
  • [40] Named Entities Based on the BERT-BILSTM-ACRF Model Recognition Research
    Wang, Jingdong
    Guo, Yongjia
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL, NLPIR 2023, 2023, : 228 - 233