Quantum self-attention neural networks for text classification

Cited by: 10
Authors
Li, Guangxi [1 ,2 ]
Zhao, Xuanqiang [1 ,3 ]
Wang, Xin [1 ,4 ]
Affiliations
[1] Baidu Research, Institute for Quantum Computing, Beijing 100193, People's Republic of China
[2] University of Technology Sydney, Centre for Quantum Software and Information, Sydney, NSW 2007, Australia
[3] The University of Hong Kong, Department of Computer Science, QICI Quantum Information and Computation Initiative, Hong Kong 999077, People's Republic of China
[4] The Hong Kong University of Science and Technology (Guangzhou), Artificial Intelligence Thrust, Information Hub, Guangzhou 511453, People's Republic of China
Funding
Australian Research Council
Keywords
quantum neural networks; self-attention; natural language processing; text classification; parameterized quantum circuits;
DOI
10.1007/s11432-023-3879-7
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a simple new network architecture, called the quantum self-attention neural network (QSANN), that overcomes these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and then employ a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, in numerical experiments on text classification tasks over public data sets, our QSANN outperforms both the best existing QNLP model based on syntactic analysis and a simple classical self-attention neural network. We further show that our method is robust to low-level quantum noise and resilient to the choice of quantum neural network architecture.
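The abstract's key technical ingredient, the Gaussian projected quantum self-attention, replaces the usual dot-product attention score with a Gaussian function of scalar query and key outputs, so that only simple measurement outcomes of parameterized quantum circuits are needed. Below is a minimal NumPy sketch of that coefficient computation. It is illustrative only: the linear projections stand in for circuit measurement outcomes, and the function name, normalization, and residual connection are assumptions for the sketch, not the paper's exact formulation.

```python
import numpy as np

def gaussian_projected_self_attention(x, w_q, w_k, w_v):
    """Classical sketch of Gaussian projected self-attention.

    x: (n_words, d) word embeddings. In QSANN each row would be encoded
    into a quantum state, and q/k/v would be measurement expectation
    values of parameterized quantum circuits; here plain linear
    projections stand in for those circuit outputs (an assumption).
    """
    q = x @ w_q  # stand-in for scalar query outcomes, shape (n, 1)
    k = x @ w_k  # stand-in for scalar key outcomes, shape (n, 1)
    v = x @ w_v  # stand-in for value vectors, shape (n, d)

    # Gaussian attention: exp(-(q_s - k_j)^2) replaces the scaled dot
    # product, so each attention score needs only two scalar outcomes.
    diff = q - k.T                              # (n, n) pairwise differences
    alpha = np.exp(-diff ** 2)
    alpha /= alpha.sum(axis=1, keepdims=True)   # normalize over keys

    return x + alpha @ v                        # residual connection (assumed)

# Toy usage: 4 words with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
y = gaussian_projected_self_attention(
    x, rng.normal(size=(8, 1)), rng.normal(size=(8, 1)), rng.normal(size=(8, 8))
)
print(y.shape)  # (4, 8)
```

One design point the Gaussian form buys: since scores depend only on the difference of two scalars, each query-key pair can be estimated from separate circuit runs, avoiding inner products between quantum states.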
Pages: 13
Related Papers
50 records in total
  • [1] Quantum self-attention neural networks for text classification
    Li, Guangxi
    Zhao, Xuanqiang
    Wang, Xin
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04) : 301 - 313
  • [2] Combining Contextual Information by Self-attention Mechanism in Convolutional Neural Networks for Text Classification
    Wu, Xin
    Cai, Yi
    Li, Qing
    Xu, Jingyun
    Leung, Ho-fung
    WEB INFORMATION SYSTEMS ENGINEERING, WISE 2018, PT I, 2018, 11233 : 453 - 467
  • [3] Deformable Self-Attention for Text Classification
    Ma, Qianli
    Yan, Jiangyue
    Lin, Zhenxi
    Yu, Liuhong
    Chen, Zipeng
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2021, 29 : 1570 - 1581
  • [4] Self-Attention Enhanced Recurrent Neural Networks for Sentence Classification
    Kumar, Ankit
Rastogi, Reshma
    2018 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2018, : 905 - 911
  • [5] Adaptive Feature Self-Attention in Spiking Neural Networks for Hyperspectral Classification
    Li, Heng
    Tu, Bing
    Liu, Bo
    Li, Jun
    Plaza, Antonio
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2025, 63
  • [6] Multi-Scale Self-Attention for Text Classification
    Guo, Qipeng
    Qiu, Xipeng
    Liu, Pengfei
    Xue, Xiangyang
    Zhang, Zheng
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7847 - 7854
  • [7] A Self-attention Based LSTM Network for Text Classification
    Jing, Ran
    2019 3RD INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2019), 2019, 1207
  • [8] Multiple Positional Self-Attention Network for Text Classification
    Dai, Biyun
    Li, Jinlong
    Xu, Ruoyi
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7610 - 7617
  • [9] Research on Time Series Prediction via Quantum Self-Attention Neural Networks
    Chen, X.
    Li, C.
    Jin, F.
    Dianzi Keji Daxue Xuebao/Journal of the University of Electronic Science and Technology of China, 2024, 53 (01) : 110 - 118
  • [10] Dual-axial self-attention network for text classification
    Zhang, Xiaochuan
    Qiu, Xipeng
    Pang, Jianmin
    Liu, Fudong
    Li, Xingwei
    SCIENCE CHINA-INFORMATION SCIENCES, 2021, 64 (12)