Deep Refinement: capsule network with attention mechanism-based system for text classification

Cited by: 0
Authors
Deepak Kumar Jain
Rachna Jain
Yash Upadhyay
Abhishek Kathuria
Xiangyuan Lan
Affiliations
[1] Chongqing University of Posts and Telecommunications, Key Laboratory of Intelligent Air-Ground Cooperative Control for Universities in Chongqing, College of Automation
[2] Bharati Vidyapeeth's College of Engineering
[3] Hong Kong Baptist University, Department of Computer Science and Engineering
Source
Neural Computing and Applications | 2020 / Volume 32
Keywords
Text classification; Capsule; Attention; LSTM; GRU; Neural network; NLP
DOI
Not available
Abstract
Most community question-answering systems lack a definite mechanism for restricting inappropriate and insincere content. A piece of text is insincere if it asserts false claims, presents something debatable as settled, or takes a non-neutral or exaggerated tone toward an individual or a group. In this paper, we propose a pipeline called Deep Refinement that combines state-of-the-art methods for information retrieval from highly sparse data, namely a capsule network and an attention mechanism. We apply the Deep Refinement pipeline to classify text into two categories, sincere and insincere. Our novel approach provides a system for classifying such questions in order to ensure enhanced monitoring and information quality. The dataset used to learn what actually constitutes sincere and insincere content is the Quora Insincere Questions dataset. Our proposed question classification method outperforms previously used text classification methods, as evidenced by an F1 score of 0.978.
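The abstract names the building blocks (capsule network, attention, GRU/LSTM encoders) but not their exact wiring, so the following is a minimal illustrative sketch rather than the authors' Deep Refinement pipeline: token embeddings feed a bidirectional GRU, a word-level attention layer weights its hidden states, and a capsule layer with dynamic routing produces two class capsules whose lengths score sincere versus insincere. All module names, dimensions, and hyperparameters below are assumptions chosen for illustration.

```python
# Illustrative sketch only: BiGRU encoder + additive attention + one capsule
# layer with dynamic routing for binary (sincere / insincere) classification.
# Dimensions and names are assumptions, not the paper's reported configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    # Capsule non-linearity: preserves direction, squashes length into [0, 1).
    sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq / (1.0 + sq)) * s / torch.sqrt(sq + eps)

class CapsuleLayer(nn.Module):
    # Routes in_caps input capsules to out_caps output capsules by agreement.
    def __init__(self, in_caps, in_dim, out_caps, out_dim, iters=3):
        super().__init__()
        self.iters = iters
        # One learned transform per (input capsule, output capsule) pair.
        self.W = nn.Parameter(0.01 * torch.randn(1, in_caps, out_caps, out_dim, in_dim))

    def forward(self, x):                       # x: (B, in_caps, in_dim)
        u = x.unsqueeze(2).unsqueeze(-1)        # (B, in_caps, 1, in_dim, 1)
        u_hat = (self.W @ u).squeeze(-1)        # (B, in_caps, out_caps, out_dim)
        b = torch.zeros(*u_hat.shape[:3], 1, device=x.device)
        for i in range(self.iters):             # dynamic routing iterations
            c = F.softmax(b, dim=2)             # coupling coefficients
            v = squash((c * u_hat).sum(1, keepdim=True))
            if i < self.iters - 1:              # update logits by agreement
                b = b + (u_hat * v).sum(-1, keepdim=True)
        return v.squeeze(1)                     # (B, out_caps, out_dim)

class CapsAttnClassifier(nn.Module):
    def __init__(self, vocab_size, max_len=64, emb_dim=128, hidden=64, caps_dim=16):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)    # additive word-level attention
        self.caps = CapsuleLayer(max_len, 2 * hidden, out_caps=2, out_dim=caps_dim)

    def forward(self, tokens):                  # tokens: (B, max_len) word ids
        h, _ = self.gru(self.emb(tokens))       # (B, max_len, 2*hidden)
        a = torch.softmax(self.attn(h), dim=1)  # attention weight per token
        primary = squash(a * h)                 # attended states as primary capsules
        class_caps = self.caps(primary)         # (B, 2, caps_dim)
        return class_caps.norm(dim=-1)          # capsule length = class score

model = CapsAttnClassifier(vocab_size=30000)
scores = model(torch.randint(1, 30000, (4, 64)))  # four dummy padded questions
print(scores.shape)  # torch.Size([4, 2]); argmax over dim 1 -> sincere/insincere
```

Using the capsule length as the class score follows the standard dynamic-routing formulation, where the norm of an output capsule encodes the probability that its class is present.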
Pages: 1839–1856
Page count: 17
Related Papers
50 results
  • [1] Deep Refinement: capsule network with attention mechanism-based system for text classification
    Jain, Deepak Kumar
    Jain, Rachna
    Upadhyay, Yash
    Kathuria, Abhishek
    Lan, Xiangyuan
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (07) : 1839 - 1856
  • [2] Recurrent Attention Capsule Network for Text Classification
    Guan, Huanmei
    Liu, Jun
    Wu, Yujia
    Li, Ni
    2019 6TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND CONTROL ENGINEERING (ICISCE 2019), 2019, : 444 - 448
  • [3] Research on a Capsule Network Text Classification Method with a Self-Attention Mechanism
    Yu, Xiaodong
    Luo, Shun-Nain
    Wu, Yujia
    Cai, Zhufei
    Kuan, Ta-Wen
    Tseng, Shih-Pang
    SYMMETRY-BASEL, 2024, 16 (05):
  • [4] A Neural Network Based Text Classification with Attention Mechanism
    Lu SiChen
    PROCEEDINGS OF 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT 2019), 2019, : 333 - 338
  • [5] BiGRU attention capsule neural network for Persian text classification
    Kenarang, Amir
    Farahani, Mehrdad
    Manthouri, Mohammad
    JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2022, 13 (8) : 3923 - 3933
  • [6] Attention-Based Deep Convolutional Capsule Network for Hyperspectral Image Classification
    Zhang, Xiaoxia
    Zhang, Xia
    IEEE ACCESS, 2024, 12 : 56815 - 56823
  • [7] Attention Mechanism-Based Deep Supervision Network for Abdominal Multi-organ Segmentation
    An, Peng
    Xu, Yurou
    Wu, Panpan
    FAST, LOW-RESOURCE, AND ACCURATE ORGAN AND PAN-CANCER SEGMENTATION IN ABDOMEN CT, FLARE 2023, 2024, 14544 : 319 - 332
  • [8] Capsule Network-Based Text Sentiment Classification
    Chen, Bingyang
    Xu, Zhidong
    Wang, Xiao
    Xu, Long
    Zhang, Weishan
    IFAC PAPERSONLINE, 2020, 53 (05): : 698 - 703
  • [9] Optimizing Automatic Text Classification Approach in Adaptive Online Collaborative Discussion–A Perspective of Attention Mechanism-Based Bi-LSTM
    Zheng, Yafeng
    Gao, Zhanghao
    Shen, Jun
    Zhai, Xuesong
    IEEE TRANSACTIONS ON LEARNING TECHNOLOGIES, 2023, 16 (05): : 591 - 602