Quantum self-attention neural networks for text classification

Cited by: 10
Authors
Li, Guangxi [1 ,2 ]
Zhao, Xuanqiang [1 ,3 ]
Wang, Xin [1 ,4 ]
Affiliations
[1] Baidu Res, Inst Quantum Comp, Beijing 100193, Peoples R China
[2] Univ Technol Sydney, Ctr Quantum Software & Informat, Sydney, NSW 2007, Australia
[3] Univ Hong Kong, Dept Comp Sci, Quantum Informat & Computat Initiat QICI, Hong Kong 999077, Peoples R China
[4] Hong Kong Univ Sci & Technol Guangzhou, Thrust Artificial Intelligence, Informat Hub, Guangzhou 511453, Peoples R China
Funding
Australian Research Council
Keywords
quantum neural networks; self-attention; natural language processing; text classification; parameterized quantum circuits;
DOI
10.1007/s11432-023-3879-7
CLC number
TP [Automation and computer technology]
Discipline code
0812
Abstract
An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a new, simple network architecture, called the quantum self-attention neural network (QSANN), which can overcome these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and employ a Gaussian projected quantum self-attention as a sensible quantum analogue of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, in numerical experiments on text classification tasks over public data sets, our QSANN outperforms both the best existing QNLP model based on syntactic analysis and a simple classical self-attention neural network. We further show that our method exhibits robustness to low-level quantum noise and resilience across choices of quantum neural network architecture.
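The Gaussian projected self-attention named in the abstract can be sketched classically. In this minimal sketch, the expectation values that the paper obtains by measuring parameterized quantum circuits are replaced with plain scalar stand-ins, and attention coefficients are taken as a normalized Gaussian kernel of the query-key difference; the function and variable names (`gaussian_self_attention`, `q`, `k`, `v`) are illustrative assumptions, not identifiers from the paper:

```python
import numpy as np

def gaussian_self_attention(q, k, v):
    """Gaussian-kernel self-attention sketch.

    q, k : (n,) arrays of scalar query/key scores, standing in for
           measured expectation values of parameterized quantum circuits.
    v    : (n, d) array of value vectors.
    Returns an (n, d) array of attention-weighted outputs.
    """
    # Gaussian kernel in log space: larger |q_s - k_j| => smaller weight.
    logits = -(q[:, None] - k[None, :]) ** 2
    alpha = np.exp(logits)
    alpha /= alpha.sum(axis=1, keepdims=True)  # normalize weights per query
    return alpha @ v

# Degenerate check: identical scores give uniform attention,
# so every output row is the mean of the value vectors.
out = gaussian_self_attention(
    np.zeros(3), np.zeros(3),
    np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]),
)
```

In the paper's setting the scalars fed into the kernel would come from circuit measurements rather than classical projections, but the normalization and weighted sum over value vectors follow the same pattern.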
Pages: 13