Short Text Classification Model Based on Multi-Attention

Cited: 0
|
Authors
Liu, Yunxiang [1 ]
Xu, Qi [1 ]
Institutions
[1] Shanghai Inst Technol, Sch Comp Sci & Informat Engn, Shanghai 201418, Peoples R China
Keywords
Deep Learning; NLP; Attention; Text Classification;
DOI
10.1109/ISCID51228.2020.00057
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Short text classification plays an important role in NLP, with applications spanning a wide range of tasks such as sentiment analysis and spam detection. Recently, the attention mechanism has been widely used in text classification. Inspired by this, a text classification model based on a multi-attention network (MAN) is proposed in this study, which performs well in extracting information related to the text category. In our model, we combine textual information through a multi-attention mechanism, which enables the model to focus on the global information of the sentence. We evaluated the effectiveness of our model on several standard text classification datasets. Experiments show that our model achieves state-of-the-art results on all datasets.
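The abstract only sketches the architecture, so the following is a minimal PyTorch illustration of a multi-attention short-text classifier, not the authors' exact model: the embedding size, number of heads, positional encoding, residual connection, and mean pooling are all illustrative assumptions.

```python
# Minimal sketch of a multi-attention text classifier (PyTorch).
# All hyperparameters and layer choices below are assumptions for
# illustration, not the architecture described in the paper.
import torch
import torch.nn as nn


class MultiAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_heads=4,
                 hidden_dim=256, num_classes=2, max_len=64):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.pos_embedding = nn.Embedding(max_len, embed_dim)
        # Multi-head self-attention: each head attends to a different
        # aspect of the sentence, giving a view of its global context.
        self.attention = nn.MultiheadAttention(embed_dim, num_heads,
                                               batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        self.classifier = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, token_ids, padding_mask=None):
        # token_ids: (batch, seq_len); padding_mask: True where padded.
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.embedding(token_ids) + self.pos_embedding(positions)
        attended, _ = self.attention(x, x, x, key_padding_mask=padding_mask)
        x = self.norm(x + attended)   # residual connection + layer norm
        x = x.mean(dim=1)             # mean-pool token representations
        return self.classifier(x)     # class logits


# Toy usage: two short sentences of 16 token ids each.
model = MultiAttentionClassifier(vocab_size=10000)
batch = torch.randint(1, 10000, (2, 16))
logits = model(batch)
print(logits.shape)  # torch.Size([2, 2])
```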
Pages: 225-229
Number of pages: 5
Related Papers
50 records in total
  • [31] Residual networks with multi-attention mechanism for hyperspectral image classification
    Shao Y.
    Lan J.
    Liang Y.
    Hu J.
    Arabian Journal of Geosciences, 2021, 14 (4)
  • [32] Multi-Attention Ghost Residual Fusion Network for Image Classification
    Jia, Xiaofen
    Du, Shengjie
    Guo, Yongcun
    Huang, Yourui
    Zhao, Baiting
    IEEE ACCESS, 2021, 9 : 81421 - 81431
  • [33] Question-Answering Aspect Classification with Multi-attention Representation
    Wu, Hanqian
    Liu, Mumu
    Wang, Jingjing
    Xie, Jue
    Li, Shoushan
    INFORMATION RETRIEVAL, CCIR 2018, 2018, 11168 : 78 - 89
  • [34] Hyperspectral Image Classification Based on Multi-attention Mechanism and Compiled Graph Neural Networks
    Jie S.
    Jing Y.
    Shujie D.
    Shaobo L.
    Jianjun H.
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2024, 55 (03): 183 - 192, 212
  • [35] Chinese Short Text Classification with Hybrid Features and Multi-Head Attention
    Jiang, Jielin
    Zhu, Yongwei
    Xu, Xiaolong
    Cui, Yan
    Zhao, Yingnan
    Computer Engineering and Applications, 60 (09): 237 - 243
  • [36] A Multi-Attention Autoencoder for Hyperspectral Unmixing Based on the Extended Linear Mixing Model
    Su, Lijuan
    Liu, Jun
    Yuan, Yan
    Chen, Qiyue
    REMOTE SENSING, 2023, 15 (11)
  • [37] Comparative Convolutional Dynamic Multi-Attention Recommendation Model
    Ni, Juan
    Huang, Zhenhua
    Yu, Chang
    Lv, Dongdong
    Wang, Cheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (08) : 3510 - 3521
  • [38] ID-insensitive deepfake detection model based on multi-attention mechanism
    Sheng, Yuncan
    Zou, Zhengrui
    Yu, Zongxuan
    Pang, Mengxue
    Ou, Wei
    Han, Wenbao
    SCIENTIFIC REPORTS, 2025, 15 (01)
  • [39] Multi-Label Text Classification model integrating Label Attention and Historical Attention
    Sun, Guoying
    Cheng, Yanan
    Dong, Fangzhou
    Wang, Luhua
    Zhao, Dong
    Zhang, Zhaoxin
    Tong, Xiaojun
    KNOWLEDGE-BASED SYSTEMS, 2024, 296
  • [40] Video Captioning using Hierarchical Multi-Attention Model
    Xiao, Huanhou
    Shi, Jinglun
    ICAIP 2018: 2018 THE 2ND INTERNATIONAL CONFERENCE ON ADVANCES IN IMAGE PROCESSING, 2018, : 96 - 101