Interpretable Time-series Classification on Few-shot Samples

Cited: 11
Authors
Tang, Wensi [1 ]
Liu, Lu [1 ]
Long, Guodong [1 ]
Affiliations
[1] Univ Technol Sydney, FEIT, Ctr AI, Sydney, NSW, Australia
Keywords
time series classification; few-shot learning; interpretability;
DOI
10.1109/ijcnn48605.2020.9206860
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recent few-shot learning works focus on training a model with prior meta-knowledge so that it can quickly adapt to new tasks with unseen classes and samples. However, conventional time-series classification algorithms fail to tackle the few-shot scenario. Existing few-shot learning methods are proposed for image or text data, and most of them are neural-based models that lack interpretability. This paper proposes an interpretable neural-based framework, namely Dual Prototypical Shapelet Networks (DPSN), for few-shot time-series classification, which not only trains a neural network-based model but also interprets the model at dual granularity: 1) a global overview using representative time-series samples, and 2) local highlights using discriminative shapelets. In particular, the generated dual prototypical shapelets consist of representative samples that best demonstrate the overall shape of all samples in a class and discriminative partial-length shapelets that can be used to distinguish between classes. We derived 18 few-shot TSC datasets from public benchmark datasets and evaluated the proposed method against baselines. The DPSN framework outperforms state-of-the-art time-series classification methods, especially when trained with limited amounts of data. Several case studies are given to demonstrate the interpretability of our model.
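To make the dual-granularity idea more concrete, the sketch below shows how a shapelet-based nearest-prototype classifier could operate: a per-class discriminative shapelet is slid over a query series, and the series is assigned to the class whose shapelet matches some subsequence most closely. This is a minimal illustration under our own assumptions, not the authors' DPSN implementation; the function names, prototypes, and toy data are hypothetical.

# Hypothetical sketch of shapelet-based nearest-prototype classification
# (illustrative only; not the DPSN code from the paper).
import numpy as np

def min_shapelet_distance(series: np.ndarray, shapelet: np.ndarray) -> float:
    """Smallest Euclidean distance between the shapelet and any
    same-length subsequence of the series (the usual shapelet-to-series distance)."""
    m = len(shapelet)
    windows = np.lib.stride_tricks.sliding_window_view(series, m)  # shape: (len(series)-m+1, m)
    return float(np.min(np.linalg.norm(windows - shapelet, axis=1)))

def classify(series: np.ndarray, prototypes: dict) -> str:
    """Assign the class whose discriminative shapelet lies closest to the series."""
    return min(prototypes, key=lambda c: min_shapelet_distance(series, prototypes[c]))

# Toy usage: two classes distinguished by a bump-shaped vs. a dip-shaped shapelet.
prototypes = {
    "bump": np.array([0.0, 1.0, 0.0]),
    "dip":  np.array([0.0, -1.0, 0.0]),
}
query = np.concatenate([np.zeros(5), [0.0, 0.9, 0.1], np.zeros(5)])
print(classify(query, prototypes))  # expected: "bump"

In DPSN the representative samples additionally give a whole-series (global) view of each class, whereas the sketch above only exercises the local, discriminative-shapelet side of the interpretation.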
Pages: 8
Related Papers
50 records in total
  • [1] Few-Shot Forecasting of Time-Series with Heterogeneous Channels
    Brinkmeyer, Lukas
    Drumond, Rafael Rego
    Burchert, Johannes
    Schmidt-Thieme, Lars
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT VI, 2023, 13718 : 3 - 18
  • [2] Online Few-Shot Time Series Classification for Aftershock Detection
    Zhong, Sheng
    Souza, Vinicius M. A.
    Baker, Glenn Eli
    Mueen, Abdullah
    PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 5707 - 5716
  • [3] Meta-Learning for Few-Shot Time Series Classification
    Wang, Sherrie
    Russwurm, Marc
    Koerner, Marco
    Lobell, David B.
    IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2020, : 7041 - 7044
  • [4] Meta-Learning for Few-Shot Time Series Classification
    Narwariya, Jyoti
    Malhotra, Pankaj
    Vig, Lovekesh
    Shroff, Gautam
    Vishnu, T. V.
    PROCEEDINGS OF THE 7TH ACM IKDD CODS AND 25TH COMAD (CODS-COMAD 2020), 2020, : 28 - 36
  • [5] Few-shot time-series anomaly detection with unsupervised domain adaptation
    Li, Hongbo
    Zheng, Wenli
    Tang, Feilong
    Zhu, Yanmin
    Huang, Jielong
    INFORMATION SCIENCES, 2023, 649
  • [6] Few-shot Time-Series Forecasting with Application for Vehicular Traffic Flow
    Tran, Victor
    Panangadan, Anand
    2022 IEEE 23RD INTERNATIONAL CONFERENCE ON INFORMATION REUSE AND INTEGRATION FOR DATA SCIENCE (IRI 2022), 2022, : 20 - 26
  • [7] Generating Representative Samples for Few-Shot Classification
    Xu, Jingyi
    Le, Hieu
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 8993 - 9003
  • [8] A Comparative Study of Few-Shot Learning Methods for 1-D ECG Time-series Classification
    Gupta, Priyanka
    Murugan, Arunachala Amuda
    Chordia, Deep
    Yannam, Pavan Kumar Reddy
    Gupta, Manik
    2024 IEEE 12TH INTERNATIONAL CONFERENCE ON HEALTHCARE INFORMATICS, ICHI 2024, 2024, : 604 - 611
  • [9] Meta-Feature Fusion for Few-Shot Time Series Classification
    Park, Seo-Hyeong
    Syazwany, Nur Suriza
    Lee, Sang-Chul
    IEEE ACCESS, 2023, 11 : 41400 - 41414
  • [10] Self-Supervised Few-Shot Time-Series Segmentation for Activity Recognition
    Xiao, Chunjing
    Chen, Shiming
    Zhou, Fan
    Wu, Jie
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (11) : 6770 - 6783