Quantum continual learning of quantum data realizing knowledge backward transfer

Cited: 3
Authors
Situ, Haozhen [1 ]
Lu, Tianxiang [1 ]
Pan, Minghua [2 ]
Li, Lvzhou [3 ]
Affiliations
[1] South China Agr Univ, Coll Math & Informat, Guangzhou 510642, Peoples R China
[2] Guilin Univ Elect Technol, Guangxi Key Lab Cryptog & Informat Secur, Guilin 541004, Peoples R China
[3] Sun Yat Sen Univ, Inst Quantum Comp & Comp Theory, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Quantum machine learning; Variational quantum algorithm; Continual learning;
DOI
10.1016/j.physa.2023.128779
CLC Number
O4 [Physics]
Discipline Code
0702
Abstract
Toward the goal of strong artificial intelligence that mimics human-level intelligence, AI systems should be able to adapt to ever-changing scenarios and continuously learn new knowledge without forgetting previously acquired knowledge. When a machine learning model is trained consecutively on a sequence of tasks, its performance on previously learned tasks may drop dramatically while it learns a newly arrived task. To avoid this phenomenon, termed catastrophic forgetting, continual learning, also known as lifelong learning, has been proposed and has become one of the most active research areas in machine learning. As quantum machine learning has blossomed in recent years, it is interesting to develop quantum continual learning. This paper focuses on quantum models for quantum data, where both the computational model and the data to be processed are quantum. The gradient episodic memory method is incorporated to design a quantum continual learning scheme that overcomes catastrophic forgetting and realizes knowledge backward transfer. Specifically, a sequence of quantum state classification tasks is learned continually by a variational quantum classifier whose parameters are optimized by a classical gradient-based optimizer. The gradient of the current task is projected onto the closest gradient that avoids increasing the loss on previous tasks while still allowing it to decrease. Numerical simulation results show that our scheme not only overcomes catastrophic forgetting but also realizes knowledge backward transfer, meaning that the classifier's performance on previous tasks is enhanced rather than compromised while a new task is being learned. © 2023 Published by Elsevier B.V.
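The gradient projection described in the abstract can be sketched in NumPy. This is a minimal illustration of the single-constraint projection idea (closer in spirit to the simplified A-GEM variant, which projects out each conflicting component one at a time); the full gradient episodic memory method instead solves a small quadratic program jointly over the stored gradients of all previous tasks. The function name `gem_project` and the toy gradients below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gem_project(grad, prev_grads):
    """Adjust the current-task gradient so that it has a nonnegative inner
    product with each stored previous-task gradient, i.e. the update step
    does not increase the loss on any previous task.

    Simplified sketch: each violated constraint is handled by projecting
    out the conflicting component; full GEM solves a quadratic program
    over all constraints at once."""
    g = grad.astype(float).copy()
    for g_ref in prev_grads:
        dot = g @ g_ref
        if dot < 0.0:  # this step would increase a previous task's loss
            g = g - (dot / (g_ref @ g_ref)) * g_ref
    return g

# Toy example: the new gradient conflicts with the stored task-1 gradient.
g_new = np.array([1.0, -1.0])
g_task1 = np.array([0.0, 1.0])
g_proj = gem_project(g_new, [g_task1])
# After projection the update no longer conflicts with task 1:
# g_proj @ g_task1 >= 0, while the non-conflicting component is kept.
```

In the paper's setting the gradients would come from a variational quantum classifier's loss on quantum-state classification tasks, but the projection step itself is purely classical, as above.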
Pages: 8