Knowledge transfer via distillation from time and frequency domain for time series classification

Cited by: 0
Authors
Kewei Ouyang
Yi Hou
Ye Zhang
Chao Ma
Shilin Zhou
Affiliations
[1] National University of Defense Technology,College of Electronic Science and Technology
Source
Applied Intelligence | 2023, Volume 53
Keywords
Knowledge distillation; Time series classification; Domain fusion
DOI: not available
Abstract
Although deep learning has achieved great success in time series classification, two issues remain unsolved. First, existing methods mainly extract features from a single domain, so useful information available in other domains is left unused. Second, multi-domain learning usually increases model size, which makes deployment on mobile devices difficult. In this study, a lightweight double-branch model, called the Time-Frequency Knowledge Reception Network (TFKR-Net), is proposed to fuse information from the time and frequency domains simultaneously. Instead of directly merging knowledge from teacher models pretrained in different domains, TFKR-Net independently distills knowledge from the time-domain and frequency-domain teachers, which helps maintain knowledge diversity. Experimental results on the UCR (University of California, Riverside) archive demonstrate that TFKR-Net significantly reduces model size and improves computational efficiency with only a slight loss in classification accuracy.
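The dual-teacher scheme described in the abstract can be sketched as a combined training loss: a standard cross-entropy term on the hard labels plus separate soft-label distillation terms for the time-domain and frequency-domain teachers, kept separate rather than merged. This is only an illustrative NumPy sketch; the function name `tfkr_distill_loss`, the temperature `T`, and the weighting `alpha` are assumptions for demonstration, not values or names taken from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the class axis (numerically stabilized)
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q):
    # KL(p || q), averaged over the batch
    eps = 1e-12
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def tfkr_distill_loss(student_logits, time_teacher_logits, freq_teacher_logits,
                      labels, T=2.0, alpha=0.5):
    """Hypothetical combined loss: cross-entropy on hard labels plus
    independent KL distillation terms from the time- and frequency-domain
    teachers, so the two sources of knowledge are not merged beforehand."""
    n = student_logits.shape[0]
    # Hard-label cross-entropy at temperature 1
    ce = -np.mean(np.log(softmax(student_logits)[np.arange(n), labels] + 1e-12))
    # Soft-label terms, one per teacher, at temperature T
    p_student = softmax(student_logits, T)
    kd_time = kl_div(softmax(time_teacher_logits, T), p_student)
    kd_freq = kl_div(softmax(freq_teacher_logits, T), p_student)
    # T**2 rescaling is the usual convention for temperature-scaled gradients
    return (1 - alpha) * ce + alpha * (T ** 2) * 0.5 * (kd_time + kd_freq)
```

Keeping `kd_time` and `kd_freq` as separate terms (rather than distilling from an averaged teacher distribution) mirrors the paper's stated motivation of maintaining knowledge diversity across domains.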
Pages: 1505-1516 (11 pages)
Related papers (50 total)
  • [1] Knowledge transfer via distillation from time and frequency domain for time series classification
    Ouyang, Kewei
    Hou, Yi
    Zhang, Ye
    Ma, Chao
    Zhou, Shilin
    APPLIED INTELLIGENCE, 2023, 53 (02) : 1505 - 1516
  • [2] Time series clustering and classification via frequency domain methods
    Holan, Scott H.
    Ravishanker, Nalini
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2018, 10 (06):
  • [3] SDKT: Similar Domain Knowledge Transfer for Multivariate Time Series Classification Tasks
    Wen, Jiaye
    Zhou, Wenan
    COMPUTATIONAL INTELLIGENCE, 2024, 40 (06)
  • [4] KDCTime: Knowledge distillation with calibration on InceptionTime for time-series classification
    Gong, Xueyuan
    Si, Yain-Whar
    Tian, Yongqi
    Lin, Cong
    Zhang, Xinyuan
    Liu, Xiaoxiang
    INFORMATION SCIENCES, 2022, 613 : 184 - 203
  • [5] A study of Knowledge Distillation in Fully Convolutional Network for Time Series Classification
    Ay, Emel
    Devanne, Maxime
    Weber, Jonathan
    Forestier, Germain
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [6] CLASSIFICATION OF HUMAN HORMONAL TIME SERIES IN FREQUENCY DOMAIN.
    Wang, D.C.C.
    Vagnucci, A.H.
    Modeling and Simulation, Proceedings of the Annual Pittsburgh Conference, 1600,
  • [7] Data Augmentation for Industrial Multivariate Time Series via a Spatial and Frequency Domain Knowledge GAN
    Lin, Jui Chien
    Yang, Fan
    2022 IEEE INTERNATIONAL SYMPOSIUM ON ADVANCED CONTROL OF INDUSTRIAL PROCESSES (ADCONIP 2022), 2022, : 244 - 249
  • [8] Time and frequency-domain feature fusion network for multivariate time series classification
    Lei, Tianyang
    Li, Jichao
    Yang, Kewei
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 252
  • [9] Reinforced Knowledge Distillation for Time Series Regression
    Xu Q.
    Wu K.
    Wu M.
    Mao K.
    Li X.
    Chen Z.
    IEEE Transactions on Artificial Intelligence, 2024, 5 (06): : 3184 - 3194
  • [10] Compression of Time Series Classification Model MC-MHLF using Knowledge Distillation
    Gengyo, Akari
    Tamura, Keiichi
    2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 22 - 27