A study of Knowledge Distillation in Fully Convolutional Network for Time Series Classification

Cited by: 5
Authors
Ay, Emel [1 ]
Devanne, Maxime [1 ]
Weber, Jonathan [1 ]
Forestier, Germain [1 ]
Affiliations
[1] Univ Haute Alsace, IRIMAS, Mulhouse, France
Keywords
Time Series Classification; Knowledge Distillation; Fully Convolutional Network
DOI
10.1109/IJCNN55064.2022.9892915
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In recent years, deep learning has revolutionized the field of machine learning. While many applications of deep learning are found in computer vision, other domains such as natural language processing (NLP) and speech recognition have also benefited from advances in deep learning research. More recently, the field of time series analysis, and more specifically time series classification (TSC), has witnessed the emergence of deep neural networks providing competitive results. Over the years, the proposed network architectures have become deeper and deeper, pushing performance higher. While these very deep models achieve impressive accuracy, their training and deployment have become challenging. Indeed, a large number of GPUs is often required to train state-of-the-art networks and obtain high performance. While the requirements of the training step can be acceptable, deploying very deep neural networks can be difficult, especially in embedded systems (e.g. robots) or devices with limited resources (e.g. web browsers, smartphones). In this context, knowledge distillation is a machine learning task that consists of transferring knowledge from a large model to a smaller one with fewer parameters. The goal is to create a lighter model that mimics the predictions of a larger one, in order to obtain similar performance at a fraction of the computational cost. In this paper, we introduce and explore the concept of knowledge distillation for the specific task of TSC. We also present a first experimental study showing promising results on several datasets of the UCR time series archive. As current state-of-the-art models for TSC are deep and sometimes ensembles of models, we believe that knowledge distillation could become an important research area in the coming years.
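The record itself contains no code; as a rough sketch of the teacher-student setup the abstract describes (soft-target distillation in the style of Hinton et al., applied to a 1D fully convolutional network), the PyTorch snippet below trains a small student FCN to mimic a larger frozen teacher. All names (`FCN`, `distillation_loss`), architectures, and hyper-parameters (`temperature`, `alpha`, the channel widths) are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FCN(nn.Module):
    """Minimal 1D fully convolutional network for time series classification."""
    def __init__(self, in_channels, n_classes, widths=(128, 256, 128)):
        super().__init__()
        layers, prev = [], in_channels
        for width, kernel in zip(widths, (8, 5, 3)):
            layers += [nn.Conv1d(prev, width, kernel, padding=kernel // 2),
                       nn.BatchNorm1d(width),
                       nn.ReLU()]
            prev = width
        self.features = nn.Sequential(*layers)
        self.head = nn.Linear(prev, n_classes)

    def forward(self, x):                     # x: (batch, channels, length)
        z = self.features(x).mean(dim=-1)     # global average pooling
        return self.head(z)                   # raw logits

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Soft-target KL term plus hard-label cross-entropy (Hinton-style KD)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2                      # rescale to balance gradient magnitudes
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# One hypothetical training step: the teacher is frozen, the student learns.
teacher = FCN(in_channels=1, n_classes=5).eval()
student = FCN(in_channels=1, n_classes=5, widths=(32, 64, 32))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(16, 1, 96)                    # dummy batch of univariate series
y = torch.randint(0, 5, (16,))
with torch.no_grad():
    teacher_logits = teacher(x)               # soft targets, no gradient
loss = distillation_loss(student(x), teacher_logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Lowering `alpha` shifts the objective toward the ground-truth labels, while higher temperatures expose more of the teacher's inter-class similarity structure; the architectures and settings actually evaluated in the paper may differ.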
Pages: 8
Related Papers
50 records in total
  • [1] Ensemble based fully convolutional transformer network for time series classification
    Dong, Yilin
    Xu, Yuzhuo
    Zhou, Rigui
    Zhu, Changming
    Liu, Jin
    Song, Jiamin
    Wu, Xinliang
    APPLIED INTELLIGENCE, 2024, 54 (19) : 8800 - 8819
  • [2] Random Subspace Ensembles of Fully Convolutional Network for Time Series Classification
    Zhang, Yangqianhui
    Mo, Chunyang
    Ma, Jiajun
    Zhao, Liang
    APPLIED SCIENCES-BASEL, 2021, 11 (22):
  • [3] LSTM Fully Convolutional Networks for Time Series Classification
    Karim, Fazle
    Majumdar, Somshubra
    Darabi, Houshang
    Chen, Shun
    IEEE ACCESS, 2018, 6 : 1662 - 1669
  • [4] Multi-Frequency Decomposition with Fully Convolutional Neural Network for Time Series Classification
    Han, Yongming
    Zhang, Shuheng
    Geng, Zhiqiang
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 284 - 289
  • [5] Insights Into LSTM Fully Convolutional Networks for Time Series Classification
    Karim, Fazle
    Majumdar, Somshubra
    Darabi, Houshang
    IEEE ACCESS, 2019, 7 : 67718 - 67725
  • [6] Fully convolutional networks with shapelet features for time series classification
    Ji, Cun
    Hu, Yupeng
    Liu, Shijun
    Pan, Li
    Li, Bo
    Zheng, Xiangwei
    INFORMATION SCIENCES, 2022, 612 : 835 - 847
  • [7] Lightweight convolutional neural network with knowledge distillation for cervical cells classification
    Chen, Wen
    Gao, Liang
    Li, Xinyu
    Shen, Weiming
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 71
  • [8] Time Series Classification With Multivariate Convolutional Neural Network
    Liu, Chien-Liang
    Hsaio, Wen-Hoar
    Tu, Yao-Chung
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2019, 66 (06) : 4788 - 4797
  • [9] Regularizing Fully Convolutional Networks for Time Series Classification by Decorrelating Filters
    Paneri, Kaushal
    Vishnu, T. V.
    Malhotra, Pankaj
    Vig, Lovekesh
    Shroff, Gautam
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 10003 - 10004
  • [10] KDCTime: Knowledge distillation with calibration on InceptionTime for time-series classification
    Gong, Xueyuan
    Si, Yain-Whar
    Tian, Yongqi
    Lin, Cong
    Zhang, Xinyuan
    Liu, Xiaoxiang
    INFORMATION SCIENCES, 2022, 613 : 184 - 203