Quantum self-attention neural networks for text classification

Cited by: 10
Authors
Li, Guangxi [1 ,2 ]
Zhao, Xuanqiang [1 ,3 ]
Wang, Xin [1 ,4 ]
Affiliations
[1] Baidu Res, Inst Quantum Comp, Beijing 100193, Peoples R China
[2] Univ Technol Sydney, Ctr Quantum Software & Informat, Sydney, NSW 2007, Australia
[3] Univ Hong Kong, Dept Comp Sci, Quantum Informat & Computat Initiat QICI, Hong Kong 999077, Peoples R China
[4] Hong Kong Univ Sci & Technol Guangzhou, Thrust Artificial Intelligence, Informat Hub, Guangzhou 511453, Peoples R China
Funding
Australian Research Council;
Keywords
quantum neural networks; self-attention; natural language processing; text classification; parameterized quantum circuits;
DOI
10.1007/s11432-023-3879-7
CLC Classification Number
TP [automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architectures make them impractical on larger, real-world data sets. In this paper, we propose a simple new network architecture, called the quantum self-attention neural network (QSANN), which overcomes these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and use a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, in numerical experiments on text classification tasks over public data sets, our QSANN outperforms both the best existing QNLP model based on syntactic analysis and a simple classical self-attention neural network. We further show that our method exhibits robustness to low-level quantum noise and resilience to variations in the quantum neural network architecture.
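The Gaussian projected self-attention described in the abstract can be illustrated with a minimal classical sketch: attention coefficients decay with the squared distance between scalar query and key projections, replacing the usual inner-product softmax. In QSANN itself these projections would be expectation values measured from parameterized quantum circuits; here random numbers stand in for them, and all names below are illustrative rather than taken from the paper's implementation.

```python
import numpy as np

def gaussian_self_attention(queries, keys, values):
    """Toy Gaussian projected self-attention.

    queries, keys: shape (n,) scalar projections per token
    values: shape (n, d) value vectors per token
    Returns the attention-weighted values, shape (n, d).
    """
    # Pairwise differences between query and key projections, shape (n, n).
    diff = queries[:, None] - keys[None, :]
    # Gaussian kernel instead of a dot-product score.
    scores = np.exp(-diff ** 2)
    # Normalize each row so the coefficients for one token sum to 1.
    weights = scores / scores.sum(axis=1, keepdims=True)
    return weights @ values

rng = np.random.default_rng(0)
n, d = 4, 3
q = rng.normal(size=n)        # stand-ins for circuit expectation values
k = rng.normal(size=n)
v = rng.normal(size=(n, d))
out = gaussian_self_attention(q, k, v)
print(out.shape)  # (4, 3)
```

Because the kernel depends only on the distance between scalar projections, each output row is a convex combination of the value vectors, with nearby query/key projections contributing most.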
Pages: 13
Related Papers
50 records in total
  • [21] Speech emotion recognition using recurrent neural networks with directional self-attention
    Li, Dongdong
    Liu, Jinlin
    Yang, Zhuo
    Sun, Linyu
    Wang, Zhe
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 173
  • [22] Enhancing Multimodal Patterns in Neuroimaging by Siamese Neural Networks with Self-Attention Mechanism
    Arco, Juan E.
    Ortiz, Andres
    Gallego-Molina, Nicolas J.
    Gorriz, Juan M.
    Ramirez, Javier
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2023, 33 (04)
  • [23] Enabling Energy-Efficient Inference for Self-Attention Mechanisms in Neural Networks
    Chen, Qinyu
    Sun, Congyi
    Lu, Zhonghai
    Gao, Chang
    2022 IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE CIRCUITS AND SYSTEMS (AICAS 2022): INTELLIGENT TECHNOLOGY IN THE POST-PANDEMIC ERA, 2022, : 25 - 28
  • [24] Text Classification Research with Attention-based Recurrent Neural Networks
    Du, C.
    Huang, L.
    INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL, 2018, 13 (01) : 50 - 61
  • [25] A Neural Network Based Text Classification with Attention Mechanism
    Lu SiChen
    PROCEEDINGS OF 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT 2019), 2019, : 333 - 338
  • [26] SA-SGRU: Combining Improved Self-Attention and Skip-GRU for Text Classification
    Huang, Yuan
    Dai, Xiaohong
    Yu, Junhao
    Huang, Zheng
    APPLIED SCIENCES-BASEL, 2023, 13 (03)
  • [27] Classifying cancer pathology reports with hierarchical self-attention networks
    Gao, Shang
    Qiu, John X.
    Alawad, Mohammed
    Hinkle, Jacob D.
    Schaefferkoetter, Noah
    Yoon, Hong-Jun
    Christian, Blair
    Fearn, Paul A.
    Penberthy, Lynne
    Wu, Xiao-Cheng
    Coyle, Linda
    Tourassi, Georgia
    Ramanathan, Arvind
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 2019, 101
  • [28] High Utility Neural Networks for Text Classification
    Wu Y.-J.
    Li J.
    Song C.-F.
    Chang J.
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2020, 48 (02): 279 - 284
  • [29] Improving Self-Attention Networks With Sequential Relations
    Zheng, Zaixiang
    Huang, Shujian
    Weng, Rongxiang
    Dai, Xinyu
    Chen, Jiajun
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2020, 28 : 1707 - 1716
  • [30] Industrial data classification using stochastic configuration networks with self-attention learning features
    Li, Weitao
    Deng, Yali
    Ding, Meishuang
    Wang, Dianhui
    Sun, Wei
    Li, Qiyue
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (24) : 22047 - 22069