Combining Contextual Information by Self-attention Mechanism in Convolutional Neural Networks for Text Classification

Cited by: 14
Authors
Wu, Xin [1 ]
Cai, Yi [1 ]
Li, Qing [2 ]
Xu, Jingyun [1 ]
Leung, Ho-fung [3 ]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou, Peoples R China
[2] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[3] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Kowloon, Hong Kong, Peoples R China
Funding
Foreign Science and Technology Cooperation Project (International Science and Technology Project);
Keywords
Convolutional neural networks; Text classification; Attention mechanism; Word representation;
DOI
10.1007/978-3-030-02922-7_31
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Convolutional neural networks (CNNs) are widely used in many NLP tasks, employing convolutional filters to capture useful semantic features of texts. However, convolutional filters with a small window size may lose the global context information of a text, while simply increasing the window size brings problems of data sparsity and an enormous number of parameters. To capture global context information, we propose using the self-attention mechanism to obtain contextual word embeddings. We present two methods to combine word and contextual embeddings, and then apply convolutional neural networks to capture semantic features. Experimental results on five commonly used datasets show the effectiveness of the proposed methods.
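The idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scaled dot-product self-attention over the word embeddings and concatenation as one plausible way to combine word and contextual embeddings before the CNN; the function names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_context(E):
    """Scaled dot-product self-attention over word embeddings E (n_words, d).

    Each output row is an attention-weighted mixture of all word
    embeddings in the text, i.e. a contextual embedding that carries
    global context information.
    """
    d = E.shape[1]
    scores = E @ E.T / np.sqrt(d)        # (n, n) pairwise similarity scores
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ E                   # (n, d) contextual embeddings

# Toy text of 7 words with 4-dimensional embeddings.
rng = np.random.default_rng(0)
E = rng.normal(size=(7, 4))

# One possible combination scheme: concatenate each word embedding with
# its contextual embedding; the (7, 8) result would be the CNN input.
C = self_attention_context(E)
combined = np.concatenate([E, C], axis=1)
```

A convolutional layer applied to `combined` then sees both the local word identity and a summary of the whole text at every position, without enlarging the filter window.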
Pages: 453-467 (15 pages)