Research on News Text Classification Based on BERT-BiLSTM-TextCNN-Attention

Cited by: 0
Authors
Wang, Jia [1 ]
Li, Zongting [2 ]
Ma, Chenyang [2 ]
Affiliations
[1] Dalian Polytech Univ, Dalian 116034, Liaoning, Peoples R China
[2] Dalian Polytech Univ, Sch Informat Sci & Engn, Dalian 116034, Liaoning, Peoples R China
Source
PROCEEDINGS OF 2024 3RD INTERNATIONAL CONFERENCE ON CYBER SECURITY, ARTIFICIAL INTELLIGENCE AND DIGITAL ECONOMY, CSAIDE 2024 | 2024
Keywords
Deep learning; text classification; natural language processing; neural network
DOI
10.1145/3672919.3672973
CLC number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Traditional machine learning models struggle to capture complex features and contextual relationships. Although a single deep learning architecture outperforms classical machine learning in text processing, it still cannot capture all of the information in a text [5]. This paper therefore proposes a news text classifier built on BERT-BiLSTM-TextCNN-Attention. The model uses BERT's pre-trained language representation to encode the text content, then feeds the resulting representations into a BiLSTM layer that captures sequence information and long-term dependencies for a comprehensive semantic view. The BiLSTM output is passed through a TextCNN layer, whose convolutions extract local semantic cues, and an attention mechanism then highlights the most informative features, refining the feature vectors before the Softmax layer performs classification. Experiments were conducted on a subset of the THUCNews Chinese news text dataset. The BERT-BiLSTM-TextCNN-Attention model achieved 96.48% accuracy, outperforming the other benchmark models. This demonstrates its advantage for Chinese news text classification and its ability to extract both deep semantic information and key local features from the text.
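To make the described architecture concrete, the following is a minimal sketch of a BERT-BiLSTM-TextCNN-Attention pipeline, assuming a PyTorch implementation with the Hugging Face transformers library. The bert-base-chinese checkpoint, hidden sizes, convolution kernel widths, and the additive-attention pooling are illustrative assumptions, not the authors' exact configuration.

# Illustrative sketch of a BERT-BiLSTM-TextCNN-Attention classifier
# (assumed hyperparameters; not the paper's exact configuration).
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel

class BertBiLSTMTextCNNAttention(nn.Module):
    def __init__(self, num_classes=10, lstm_hidden=128,
                 num_filters=100, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # 1-D convolutions over the BiLSTM outputs capture local n-gram cues.
        self.convs = nn.ModuleList(
            nn.Conv1d(2 * lstm_hidden, num_filters, k) for k in kernel_sizes)
        # Simple additive attention scores each position of the feature maps.
        self.attn = nn.Linear(num_filters, 1)
        self.classifier = nn.Linear(num_filters, num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings from pre-trained BERT.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # BiLSTM models long-range dependencies in both directions.
        seq, _ = self.bilstm(hidden)                      # (B, L, 2H)
        seq = seq.transpose(1, 2)                         # (B, 2H, L)
        # TextCNN: convolve with several kernel widths, keep all positions.
        feats = torch.cat([F.relu(conv(seq)) for conv in self.convs],
                          dim=2).transpose(1, 2)          # (B, L', F)
        # Attention re-weights positions before pooling to a single vector.
        weights = torch.softmax(self.attn(feats), dim=1)  # (B, L', 1)
        pooled = (weights * feats).sum(dim=1)             # (B, F)
        return self.classifier(pooled)                    # logits for Softmax

In use, input_ids and attention_mask would come from the matching bert-base-chinese tokenizer, and the output logits would be trained with Softmax cross-entropy over the THUCNews category labels.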
Pages: 295-298
Number of pages: 4
Related papers
50 in total
  • [41] Short Text Classification Model Based on Multi-Attention
    Liu, Yunxiang
    Xu, Qi
    2020 13TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2020), 2020, : 225 - 229
  • [42] Research on Chinese Keyword Recognition Based on BERT Binary Classification Algorithm
    Zhu, Chunling
    Wu, Di
    PROCEEDINGS OF 2024 INTERNATIONAL CONFERENCE ON MACHINE INTELLIGENCE AND DIGITAL APPLICATIONS, MIDA2024, 2024, : 689 - 695
  • [43] A text classification method based on LSTM and graph attention network
    Wang, Haitao
    Li, Fangbing
    CONNECTION SCIENCE, 2022, 34 (01) : 2466 - 2480
  • [44] Fake News Classification using transformer based enhanced LSTM and BERT
    Rai, N.
    Kumar, D.
    Kaushik, N.
    Raj, C.
    Ali, A.
    International Journal of Cognitive Computing in Engineering, 2022, 3 : 98 - 105
  • [45] Research on entity recognition and alignment of APT attack based on Bert and BiLSTM-CRF
    Yang, Xiuzhang
    Peng, Guojun
    Li, Zichuan
    Lyu, Yangqi
    Liu, Side
    Li, Chenguang
    Tongxin Xuebao/Journal on Communications, 2022, 43 (06): : 58 - 70
  • [46] BiLSTM and Attention-Based Modulation Classification of Realistic Wireless Signals
    Udaiwal, Rohit
    Baishya, Nayan
    Gupta, Yash
    Manoj, B. R.
    2024 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS, SPCOM 2024, 2024,
  • [47] Text classification of Chinese news based on multi-scale CNN and LSTM hybrid model
    Zhai, ZhengLi
    Zhang, Xin
    Fang, FeiFei
    Yao, LuYao
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (14) : 20975 - 20988
  • [48] Text classification of Chinese news based on multi-scale CNN and LSTM hybrid model
    ZhengLi Zhai
    Xin Zhang
    FeiFei Fang
    LuYao Yao
    Multimedia Tools and Applications, 2023, 82 : 20975 - 20988
  • [49] News Text Classification Based on MLCNN and BiGRU Hybrid Neural Network
    Duan, Jiajia
    Zhao, Hui
    Qin, Wenshuai
    Qiu, Meikang
    Liu, Meiqin
    2020 3RD INTERNATIONAL CONFERENCE ON SMART BLOCKCHAIN (SMARTBLOCK), 2020, : 137 - 142
  • [50] Text Matching in Insurance Question-Answering Community Based on an Integrated BiLSTM-TextCNN Model Fusing Multi-Feature
    Li, Zhaohui
    Yang, Xueru
    Zhou, Luli
    Jia, Hongyu
    Li, Wenli
    ENTROPY, 2023, 25 (04)