Deep Refinement: capsule network with attention mechanism-based system for text classification

Cited: 0
Authors
Deepak Kumar Jain
Rachna Jain
Yash Upadhyay
Abhishek Kathuria
Xiangyuan Lan
Affiliations
[1] Chongqing University of Posts and Telecommunications, Key Laboratory of Intelligent Air-Ground Cooperative Control for Universities in Chongqing, College of Automation
[2] Bharati Vidyapeeth’s College of Engineering
[3] Hong Kong Baptist University, Department of Computer Science and Engineering
Source
Neural Computing and Applications | 2020, Vol. 32
Keywords
Text classification; Capsule; Attention; LSTM; GRU; Neural network; NLP;
DOI
Not available
Abstract
Most community question-answering systems lack a definite mechanism for restricting inappropriate and insincere content in the text of questions. A piece of text can be insincere if it asserts false claims, assumes something debatable, or takes a non-neutral or exaggerated tone toward an individual or a group. In this paper, we propose a pipeline called Deep Refinement, which combines state-of-the-art methods for information retrieval from highly sparse data, such as capsule networks and the attention mechanism. We apply the Deep Refinement pipeline to classify text into two categories, sincere and insincere. Our novel approach provides a system for classifying such questions in order to ensure enhanced monitoring and information quality. The Quora Insincere Questions dataset is used to learn what actually constitutes sincere and insincere text. Our proposed question classification method outperforms previously used text classification methods, as evident from its F1 score of 0.978.
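As a rough illustration only (not the authors' code, whose details are not given in this record), the attention step named in the keywords can be sketched as a learned pooling over recurrent hidden states: each time step is scored, the scores are normalized with a softmax, and the hidden states are combined into a single context vector. All shapes and the scoring vector below are hypothetical toy values.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Score each time step with a learned vector w, normalize the
    scores with softmax, and return the weighted sum of hidden states."""
    scores = H @ w           # (T,) one score per time step
    alpha = softmax(scores)  # attention weights, non-negative, sum to 1
    context = alpha @ H      # (d,) weighted combination of hidden states
    return context, alpha

rng = np.random.default_rng(0)
T, d = 6, 8                        # 6 time steps, hidden size 8 (toy values)
H = rng.standard_normal((T, d))    # stand-in for LSTM/GRU hidden states
w = rng.standard_normal(d)         # stand-in for a learned scoring vector

context, alpha = attention_pool(H, w)
print(alpha.sum())      # sums to 1 (up to float error)
print(context.shape)    # (8,)
```

In the full pipeline described by the abstract, a vector like `context` would feed a downstream classifier (sincere vs. insincere); the capsule-network component is omitted here for brevity.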
Pages: 1839-1856
Page count: 17
Related papers
50 records in total
  • [21] Multi-applicable text classification based on deep neural network
    Yang, Jingjing
    Deng, Feng
    Lv, Suhuan
    Wang, Rui
    Guo, Qi
    Kou, Zongchun
    Chen, Shiqiang
    INTERNATIONAL JOURNAL OF SENSOR NETWORKS, 2022, 40 (04) : 277 - 286
  • [22] An attention mechanism-based LSTM network for cancer kinase activity prediction
    Danishuddin
    Kumar, V.
    Lee, G.
    Yoo, J.
    Ro, H. S.
    Lee, K. W.
    SAR AND QSAR IN ENVIRONMENTAL RESEARCH, 2022, 33 (08) : 631 - 647
  • [23] Text Classification with Attention Gated Graph Neural Network
    Deng, Zhaoyang
    Sun, Chenxiang
    Zhong, Guoqiang
    Mao, Yuxu
    COGNITIVE COMPUTATION, 2022, 14 (04) : 1464 - 1473
  • [25] Text Classification Based on Convolutional Neural Network and Attention Model
    Yang, Shuang
    Tang, Yan
    2020 3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND BIG DATA (ICAIBD 2020), 2020, : 67 - 73
  • [26] Short Text Classification Model Based on Multi-Attention
    Liu, Yunxiang
    Xu, Qi
    2020 13TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2020), 2020, : 225 - 229
  • [27] Biomedical Text Classification Method Based on Hypergraph Attention Network
    Simeng B.
    Zhendong N.
    Hui H.
    Kaize S.
    Kun Y.
    Yuanchi M.
    Data Analysis and Knowledge Discovery, 2022, 6 (11): : 13 - 24
  • [28] Attention Mechanism-Based Glaucoma Classification Model Using Retinal Fundus Images
    Cho, You-Sang
    Song, Ho-Jung
    Han, Ju-Hyuck
    Kim, Yong-Suk
    SENSORS, 2024, 24 (14)
  • [29] Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
    Jia, Xudong
    Wang, Li
    PEERJ COMPUTER SCIENCE, 2022, 8