Long-Short-Term Memory Based on Adaptive Convolutional Network for Time Series Classification

Times cited: 4
Authors
Li, Yujuan [1 ]
Wu, Yonghong [1 ]
Affiliations
[1] Wuhan Univ Technol, Sch Sci, Wuhan 430070, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Time series classification; Adaptive selection; Long short-term memory; Attention mechanism; Deep neural network; ATTENTION;
DOI
10.1007/s11063-023-11148-w
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep learning is effective for time series classification tasks, but existing deep learning algorithms with fixed-step convolutions cannot effectively extract and focus on multi-scale features. Motivated by the complexity and long-term dependence of time series data, an end-to-end model, called Adaptive Convolutional Network Long Short-Term Memory (ACN-LSTM), is proposed in this paper. The network is composed of two branches: a long short-term memory (LSTM) branch and an adaptive convolutional neural network (ACN) branch. The LSTM uses memory cells and a gate mechanism to control the transmission of sequence information, fully extract the correlation information of the time series, and enhance the discriminative power of the network. The ACN obtains local features of the time series by stacking one-dimensional convolutional neural blocks (Conv1D), after which a multi-scale convolutional neural block captures information at different scales. Meanwhile, an inter-layer adaptive channel feature adjustment mechanism (ACFM) is proposed to adaptively adjust the feature information between layers. ACN-LSTM not only fully extracts long-term temporal correlation information but also enables neurons to adaptively adjust their receptive field sizes, thereby obtaining more accurate classification results. Experimental results on 65 UCR standard datasets show that the proposed ACN-LSTM achieves the highest arithmetic and geometric mean ranks (2.492 and 2.108) and the lowest mean error (0.127) compared with other models, indicating that it is effective for univariate time series classification.
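The record does not include the paper's implementation details, but the abstract's idea of multi-scale convolution branches whose outputs are fused so that "neurons adaptively adjust their receptive field sizes" can be illustrated with a minimal numpy sketch. The fusion rule below (softmax channel weights derived from global average pooling, in the style of selective-kernel attention) is an assumption for illustration, not the paper's actual ACFM; the function names `conv1d` and `adaptive_multiscale_block` are hypothetical.

```python
import numpy as np

def conv1d(x, kernel):
    # 'Same'-padded 1-D convolution of a (length, in_channels) signal
    # with a (k, in_channels, out_channels) kernel, stride 1.
    k, c_in, c_out = kernel.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros((x.shape[0], c_out))
    for t in range(x.shape[0]):
        out[t] = np.tensordot(xp[t:t + k], kernel, axes=([0, 1], [0, 1]))
    return out

def adaptive_multiscale_block(x, kernels):
    # Assumed fusion rule: run parallel convolutions with different
    # kernel sizes (receptive fields), then combine them with softmax
    # channel weights computed from global average pooling, so each
    # channel adaptively favours one scale.
    branches = [np.maximum(conv1d(x, k), 0.0) for k in kernels]  # ReLU
    stats = np.stack([b.mean(axis=0) for b in branches])  # (n_branches, c_out)
    weights = np.exp(stats) / np.exp(stats).sum(axis=0, keepdims=True)
    return sum(w * b for w, b in zip(weights, branches))

rng = np.random.default_rng(0)
x = rng.standard_normal((128, 1))  # univariate series of length 128
kernels = [rng.standard_normal((k, 1, 8)) * 0.1 for k in (3, 5, 7)]
y = adaptive_multiscale_block(x, kernels)
print(y.shape)  # (128, 8): per-time-step features from adaptively fused scales
```

Because the per-branch softmax weights sum to one for every channel, the fused output stays in the convex hull of the branch activations; larger kernels dominate wherever their pooled response is strongest.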
Pages: 6547-6569
Page count: 23
References
36 in total
[1] Anonymous. ACM SIGKDD Explorations Newsletter, 2010.
[2] Bagnall A, Lines J, Bostrom A, Large J, Keogh E. The great time series classification bake off: a review and experimental evaluation of recent algorithmic advances. Data Mining and Knowledge Discovery, 2017, 31(3): 606-660.
[3] Bagnall A, Lines J, Hills J, Bostrom A. Time-series classification with COTE: the collective of transformation-based ensembles. IEEE Transactions on Knowledge and Data Engineering, 2015, 27(9): 2522-2535.
[4] Baydogan MG, Runger G, Tuv E. A bag-of-features framework to classify time series. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(11): 2796-2802.
[5] Carrasco M, Barbot A. Spatial attention alters visual appearance. Current Opinion in Psychology, 2019, 29: 56-64.
[6] Chen W, Shi K. Multi-scale attention convolutional neural network for time series classification. Neural Networks, 2021, 136: 126-140.
[7] Fang C, He D, Li K, Liu Y, Wang F. Image-based thickener mud layer height prediction with attention mechanism-based CNN. ISA Transactions, 2022, 128: 677-689.
[8] Fawaz HI, Lucas B, Forestier G, Pelletier C, Schmidt DF, Weber J, Webb GI, Idoumghar L, Muller PA, Petitjean F. InceptionTime: finding AlexNet for time series classification. Data Mining and Knowledge Discovery, 2020, 34(6): 1936-1962.
[9] Fawaz HI, Forestier G, Weber J, Idoumghar L, Muller PA. Deep learning for time series classification: a review. Data Mining and Knowledge Discovery, 2019, 33(4): 917-963.
[10] Gulli A. Deep Learning with Keras, 2017, p. 15.