SelfMatch: Robust semisupervised time-series classification with self-distillation

Cited by: 98
Authors
Xing, Huanlai [1 ]
Xiao, Zhiwen [1 ]
Zhan, Dawei [1 ]
Luo, Shouxi [1 ]
Dai, Penglin [1 ]
Li, Ke [1 ]
Affiliations
[1] Southwest Jiaotong University, School of Computing and Artificial Intelligence, Chengdu 611756, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
data mining; deep learning; knowledge distillation; semisupervised learning; time-series classification; representation; network
DOI
10.1002/int.22957
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Over the years, a number of semisupervised deep-learning algorithms have been proposed for time-series classification (TSC). In semisupervised deep learning, viewed from the perspective of the representation hierarchy, the semantic information extracted at lower levels forms the basis of that extracted at higher levels. A natural question is whether high-level semantic information can, in turn, help capture low-level semantic information. This paper studies this question and proposes SelfMatch, a robust semisupervised model with self-distillation (SD) that simplifies existing semisupervised learning (SSL) techniques for TSC. SelfMatch combines supervised learning, unsupervised learning, and SD. In the unsupervised branch, SelfMatch applies pseudolabeling to feature extraction on unlabeled data: the prediction on a weakly augmented sequence serves as the target that guides the prediction on a Timecut-augmented version of the same sequence. SD promotes knowledge flow from higher to lower levels, guiding the extraction of low-level semantic information. The paper also designs a feature extractor for TSC, called ResNet-LSTMaN, responsible for feature and relation extraction. Experimental results show that SelfMatch achieves excellent SSL performance on 35 widely adopted UCR2018 data sets compared with a number of state-of-the-art semisupervised and supervised algorithms.
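To make the described objective concrete, the following is a minimal PyTorch-style sketch of a FixMatch-style pseudolabel consistency loss combined with a self-distillation term, written from the abstract alone. It is an illustration under stated assumptions, not the authors' implementation: the confidence threshold tau, the MSE-based feature distillation, the loss weights, and the model interface (returning logits plus a list of per-level features already projected to a common shape) are hypothetical, as are the helper names weak_augment and timecut.

import torch
import torch.nn.functional as F

def selfmatch_style_loss(model, x_lab, y_lab, x_unlab, weak_augment, timecut,
                         tau=0.95, lambda_u=1.0, lambda_sd=0.1):
    # Supervised term on the labeled batch; the model is assumed to return
    # (logits, per_level_features).
    logits_lab, feats = model(x_lab)
    loss_sup = F.cross_entropy(logits_lab, y_lab)

    # Unsupervised term: the prediction on a weakly augmented view provides a
    # pseudolabel that guides the prediction on a Timecut-augmented view of
    # the same sequence.
    with torch.no_grad():
        logits_weak, _ = model(weak_augment(x_unlab))
        conf, pseudo = logits_weak.softmax(dim=-1).max(dim=-1)
        mask = (conf >= tau).float()  # keep only confident pseudolabels
    logits_strong, _ = model(timecut(x_unlab))
    per_sample = F.cross_entropy(logits_strong, pseudo, reduction="none")
    loss_unsup = (per_sample * mask).mean()

    # Self-distillation term: a deeper level's features teach a shallower
    # level's, pushing high-level semantics down the representation hierarchy
    # (features assumed projected to a common shape beforehand).
    loss_sd = F.mse_loss(feats[0], feats[-1].detach())

    return loss_sup + lambda_u * loss_unsup + lambda_sd * loss_sd

Detaching the deeper features in the SD term stops gradients from flowing into the higher level, so the higher level acts as a fixed teacher for the lower one within each step.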
Pages: 8583-8610
Page count: 28
References
60 in total
[11] Deng H, Runger G, Tuv E, Martyanov V. A time series forest for classification and feature extraction. Information Sciences, 2013, 239: 142-153.
[12] DeVries T. In: Proceedings of ICML, 2015: 448.
[13] Diao C, Liu T, Yang Z, Duan Y. Comparison between deep learning and fully connected neural network in performance prediction of power cycles: taking supercritical CO2 Brayton cycle as an example. International Journal of Intelligent Systems, 2021, 36(12): 7682-7708.
[14] Fauvel F. arXiv preprint, 2020.
[15] Fawaz H I, Lucas B, Forestier G, Pelletier C, Schmidt D F, Weber J, Webb G I, Idoumghar L, Muller P-A, Petitjean F. InceptionTime: finding AlexNet for time series classification. Data Mining and Knowledge Discovery, 2020, 34(6): 1936-1962.
[16] Fawaz H I, Forestier G, Weber J, Idoumghar L, Muller P-A. Deep learning for time series classification: a review. Data Mining and Knowledge Discovery, 2019, 33(4): 917-963.
[17] Franceschi J Y. Advances in Neural Information Processing Systems, 2019, 32.
[18] Gonzalez M, Bergmeir C, Triguero I, Rodriguez Y, Benitez J M. Self-labeling techniques for semi-supervised time series classification: an empirical study. Knowledge and Information Systems, 2018, 55(2): 493-528.
[19] Goschenhofer J, Hvingelby R, Ruegamer D, Thomas J, Wagner M, Bischl B. Deep semi-supervised learning for time series classification. In: 20th IEEE International Conference on Machine Learning and Applications (ICMLA 2021), 2021: 422-428.
[20] Gou J, Yu B, Maybank S J, Tao D. Knowledge distillation: a survey. International Journal of Computer Vision, 2021, 129(6): 1789-1819.