SelfMatch: Robust semisupervised time-series classification with self-distillation

Cited: 98
Authors
Xing, Huanlai [1 ]
Xiao, Zhiwen [1 ]
Zhan, Dawei [1 ]
Luo, Shouxi [1 ]
Dai, Penglin [1 ]
Li, Ke [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu 611756, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
data mining; deep learning; knowledge distillation; semisupervised learning; time-series classification; REPRESENTATION; NETWORK;
DOI
10.1002/int.22957
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Over the years, a number of semisupervised deep-learning algorithms have been proposed for time-series classification (TSC). In semisupervised deep learning, from the viewpoint of the representation hierarchy, the semantic information extracted at lower levels is the basis of that extracted at higher levels. The authors ask whether high-level semantic information, once extracted, can in turn help capture low-level semantic information. This paper studies this problem and proposes SelfMatch, a robust semisupervised model with self-distillation (SD) that simplifies existing semisupervised learning (SSL) techniques for TSC. SelfMatch hybridizes supervised learning, unsupervised learning, and SD. In unsupervised learning, SelfMatch applies pseudolabeling to unlabeled data: a weakly augmented sequence serves as the target that guides the prediction for a Timecut-augmented version of the same sequence. SD promotes knowledge flow from higher to lower levels, guiding the extraction of low-level semantic information. The paper also designs a feature extractor for TSC, called ResNet-LSTMaN, responsible for feature and relation extraction. Experimental results on 35 widely adopted UCR2018 data sets show that SelfMatch achieves excellent SSL performance compared with a number of state-of-the-art semisupervised and supervised algorithms.
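The two ideas at the core of the abstract, pseudolabeling of unlabeled sequences via weak/strong augmentation consistency and self-distillation from a deep to a shallow classifier head, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the augmentation strengths, confidence threshold, and loss forms (FixMatch-style hard pseudolabels; KL-based distillation) are assumptions, and Timecut is approximated here by zeroing a random contiguous segment of the series.

```python
import numpy as np

def weak_augment(x, rng, jitter=0.01):
    # Weak augmentation (assumed): small Gaussian jitter on the sequence.
    return x + rng.normal(0.0, jitter, size=x.shape)

def timecut_augment(x, rng, cut_frac=0.3):
    # "Timecut"-style strong augmentation (assumed form): zero out a
    # random contiguous segment of the time series.
    x = x.copy()
    n = len(x)
    cut = int(n * cut_frac)
    start = rng.integers(0, n - cut + 1)
    x[start:start + cut] = 0.0
    return x

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def pseudolabel_consistency_loss(logits_weak, logits_strong, threshold=0.95):
    # Pseudolabeling on an unlabeled sequence: keep the weak-view
    # prediction only when it is confident, then use its argmax as a
    # hard target (cross-entropy) for the strongly augmented view.
    p_weak = softmax(logits_weak)
    if p_weak.max() < threshold:
        return 0.0  # low confidence: no pseudolabel, no loss
    target = p_weak.argmax()
    p_strong = softmax(logits_strong)
    return float(-np.log(p_strong[target] + 1e-12))

def self_distillation_loss(logits_shallow, logits_deep):
    # Self-distillation (assumed form): the deepest head's prediction
    # acts as teacher for a shallower head, pushing high-level semantics
    # down to lower levels via KL divergence KL(p_deep || p_shallow).
    p_deep = softmax(logits_deep)
    p_shallow = softmax(logits_shallow)
    return float(np.sum(p_deep * (np.log(p_deep + 1e-12)
                                  - np.log(p_shallow + 1e-12))))
```

In a training loop, the total objective would combine a supervised cross-entropy on the labeled subset with these two unsupervised terms; the confidence threshold keeps noisy early pseudolabels from dominating.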
Pages: 8583-8610 (28 pages)