Quantum self-attention neural networks for text classification

Cited by: 10
Authors
Li, Guangxi [1 ,2 ]
Zhao, Xuanqiang [1 ,3 ]
Wang, Xin [1 ,4 ]
Affiliations
[1] Baidu Res, Inst Quantum Comp, Beijing 100193, Peoples R China
[2] Univ Technol Sydney, Ctr Quantum Software & Informat, Sydney, NSW 2007, Australia
[3] Univ Hong Kong, Dept Comp Sci, Quantum Informat & Computat Initiat QICI, Hong Kong 999077, Peoples R China
[4] Hong Kong Univ Sci & Technol Guangzhou, Thrust Artificial Intelligence, Informat Hub, Guangzhou 511453, Peoples R China
Funding
Australian Research Council
Keywords
quantum neural networks; self-attention; natural language processing; text classification; parameterized quantum circuits;
DOI
10.1007/s11432-023-3879-7
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a simple new network architecture, called the quantum self-attention neural network (QSANN), which compensates for these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and use Gaussian projected quantum self-attention as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, our QSANN outperforms the best existing syntax-based QNLP model as well as a simple classical self-attention neural network in numerical experiments on text classification tasks over public data sets. We further show that our method is robust to low-level quantum noise and resilient to changes in the quantum neural network architecture.
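To make the abstract's central idea concrete, below is a minimal classical simulation sketch of Gaussian projected self-attention. It assumes single-qubit RY circuits producing query, key, and value features from scalar token embeddings; the function names, circuit choices, and scalar values are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

# Assumption: a toy single-qubit "parameterized quantum circuit" that
# encodes a scalar feature with RY(x), applies a trainable RY(theta),
# and measures the Pauli-Z expectation value.
def ry(angle: float) -> np.ndarray:
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def pqc_expectation(x: float, theta: float) -> float:
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # start from |0>
    pauli_z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ pauli_z @ state)  # <Z> (state is real here)

def gaussian_quantum_self_attention(xs, theta_q, theta_k, theta_v):
    """Gaussian-projected self-attention over a sequence of scalar features.

    Queries, keys, and values are <Z> expectations of separate (hypothetical)
    query/key/value circuits; attention weights come from a Gaussian kernel
    rather than the usual softmax over inner products.
    """
    q = np.array([pqc_expectation(x, theta_q) for x in xs])
    k = np.array([pqc_expectation(x, theta_k) for x in xs])
    v = np.array([pqc_expectation(x, theta_v) for x in xs])
    # Gaussian projection: alpha_ij proportional to exp(-(q_i - k_j)^2)
    alpha = np.exp(-np.subtract.outer(q, k) ** 2)
    alpha /= alpha.sum(axis=1, keepdims=True)  # normalize per query position
    return alpha @ v  # attention-weighted value for each token position

if __name__ == "__main__":
    tokens = [0.1, 1.2, -0.7]  # toy embedded word features (assumed)
    print(gaussian_quantum_self_attention(tokens, 0.3, -0.5, 0.9))
```

The Gaussian kernel exp(-(q_i - k_j)^2) in place of a softmax over inner products appears to be the "projection" the abstract alludes to; since the circuit outputs are bounded expectation values, the kernel keeps the attention weights well behaved without extra scaling.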
Pages: 13