Combining Contextual Information by Self-attention Mechanism in Convolutional Neural Networks for Text Classification

Cited by: 14
Authors
Wu, Xin [1 ]
Cai, Yi [1 ]
Li, Qing [2 ]
Xu, Jingyun [1 ]
Leung, Ho-fung [3 ]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou, Peoples R China
[2] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[3] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Kowloon, Hong Kong, Peoples R China
Source
WEB INFORMATION SYSTEMS ENGINEERING, WISE 2018, PT I | 2018, Vol. 11233
Funding
Foreign Science and Technology Cooperation Project (International Science and Technology Project);
Keywords
Convolutional neural networks; Text classification; Attention mechanism; Word representation;
DOI
10.1007/978-3-030-02922-7_31
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Convolutional neural networks (CNNs) are widely used in many NLP tasks, as they can employ convolutional filters to capture useful semantic features of texts. However, convolutional filters with a small window size may lose the global context information of a text, while simply increasing the window size brings the problems of data sparsity and an enormous number of parameters. To capture global context information, we propose to use the self-attention mechanism to obtain contextual word embeddings. We present two methods to combine word embeddings and contextual embeddings, and then apply convolutional neural networks to capture semantic features. Experimental results on five commonly used datasets show the effectiveness of our proposed methods.
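The pipeline the abstract describes (self-attention producing contextual embeddings, a combination step, then a convolutional feature extractor with max-over-time pooling) can be sketched minimally in NumPy. This is not the paper's implementation: scaled dot-product self-attention and concatenation as the combination method are assumptions for illustration, and the paper's second combination method is not shown.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(E):
    # E: (seq_len, d) word embeddings.
    # Scaled dot-product self-attention (assumed variant): each output row
    # is a context-weighted mixture of all word embeddings in the sentence.
    d = E.shape[1]
    scores = E @ E.T / np.sqrt(d)          # (seq_len, seq_len)
    A = softmax(scores, axis=-1)           # rows sum to 1
    return A @ E                           # contextual embeddings, (seq_len, d)

def combine_concat(E, C):
    # One plausible combination method: concatenate word and contextual
    # embeddings along the feature dimension.
    return np.concatenate([E, C], axis=1)  # (seq_len, 2 * d)

def conv_features(X, W, b):
    # X: (seq_len, d_in); W: (window, d_in, n_filters).
    # Valid 1-D convolution over the token axis + ReLU + max-over-time pooling,
    # as in standard CNN text classifiers.
    window = W.shape[0]
    n = X.shape[0] - window + 1
    feats = np.stack([
        np.tensordot(X[i:i + window], W, axes=([0, 1], [0, 1])) + b
        for i in range(n)
    ])                                     # (n, n_filters)
    return np.maximum(feats, 0).max(axis=0)  # (n_filters,)

rng = np.random.default_rng(0)
E = rng.normal(size=(7, 16))       # 7 tokens, 16-dim word embeddings
C = self_attention(E)              # contextual embeddings via self-attention
X = combine_concat(E, C)           # combined representation, (7, 32)
W = rng.normal(size=(3, 32, 4))    # window size 3, 4 convolutional filters
b = np.zeros(4)
features = conv_features(X, W, b)  # pooled feature vector, shape (4,)
```

In a full classifier, `features` would feed a softmax layer; here the sketch only shows how contextual information enters the convolution's input representation.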
Pages: 453 - 467
Number of pages: 15
Related Papers
50 records in total
  • [1] Combining weighted category-aware contextual information in convolutional neural networks for text classification
    Xin Wu
    Yi Cai
    Qing Li
    Jingyun Xu
    Ho-fung Leung
    World Wide Web, 2020, 23 : 2815 - 2834
  • [3] Quantum self-attention neural networks for text classification
    Li, Guangxi
    Zhao, Xuanqiang
    Wang, Xin
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04)
  • [4] Real world image tampering localization combining the self-attention mechanism and convolutional neural networks
    Zhong H.
    Bian S.
    Wang C.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (01): : 135 - 146
  • [5] Combining Contextual Information by Integrated Attention Mechanism in Convolutional Neural Networks for Digital Elevation Model Super-Resolution
    Chen, Zhanlong
    Han, Xiaoyi
    Ma, Xiaochuan
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62
  • [6] Combining Gated Convolutional Networks and Self-Attention Mechanism for Speech Emotion Recognition
    Li, Chao
    Jiao, Jinlong
    Zhao, Yiqin
    Zhao, Ziping
    2019 8TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2019, : 105 - 109
  • [7] Probabilistic Matrix Factorization Recommendation of Self-Attention Mechanism Convolutional Neural Networks With Item Auxiliary Information
    Zhang, Chenkun
    Wang, Cheng
    IEEE ACCESS, 2020, 8 (08): : 208311 - 208321
  • [8] Convolutional Recurrent Neural Networks with a Self-Attention Mechanism for Personnel Performance Prediction
    Xue, Xia
    Feng, Jun
    Gao, Yi
    Liu, Meng
    Zhang, Wenyu
    Sun, Xia
    Zhao, Aiqi
    Guo, Shouxi
    ENTROPY, 2019, 21 (12)
  • [9] Combining Knowledge with Attention Neural Networks for Short Text Classification
    Li, Wei
    Li, Li
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2021, PT II, 2021, 12816 : 240 - 251
  • [10] Image Classification based on Self-attention Convolutional Neural Network
    Cai, Xiaohong
    Li, Ming
    Cao, Hui
    Ma, Jingang
    Wang, Xiaoyan
    Zhuang, Xuqiang
    SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913