Long-Short-Term Memory Based on Adaptive Convolutional Network for Time Series Classification

Times Cited: 3
Authors
Li, Yujuan [1 ]
Wu, Yonghong [1 ]
Affiliations
[1] Wuhan Univ Technol, Sch Sci, Wuhan 430070, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Time series classification; Adaptive selection; Long short-term memory; Attention mechanism; Deep neural network; ATTENTION;
DOI
10.1007/s11063-023-11148-w
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep learning is effective for time series classification tasks, but existing deep learning algorithms with fixed-stride convolutions cannot effectively extract and focus on multi-scale features. Motivated by the complexity and long-term dependence of time series data, an end-to-end model, called Adaptive Convolutional Network Long-Short-Term Memory (ACN-LSTM), is proposed in this paper. The network consists of two branches: a long short-term memory (LSTM) branch and an adaptive convolutional network (ACN) branch. The LSTM branch uses memory cells and a gate mechanism to control the transmission of sequence information, fully extract the correlation information of the time series, and enhance the discriminative power of the network. The ACN branch obtains local features of the time series by stacking one-dimensional convolutional blocks (Conv1D), after which a multi-scale convolutional block captures information at different scales. Meanwhile, an inter-layer adaptive channel feature adjustment mechanism (ACFM) is proposed to adaptively adjust the feature information between layers. ACN-LSTM not only fully extracts long-term temporal correlation information but also lets neurons adaptively adjust their receptive field sizes, yielding more accurate classification results. Experiments on 65 UCR standard datasets show that ACN-LSTM achieves the best arithmetic and geometric mean ranks (2.492 and 2.108) and the lowest mean error (0.127) compared with other models, indicating its effectiveness for univariate time series classification.
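The abstract describes the multi-scale branch only at a high level. As a rough NumPy sketch of the general idea of adaptively fusing convolutions of different scales (the function names and the global-average softmax weighting below are illustrative assumptions, not the paper's actual ACFM or network):

```python
import numpy as np

def conv1d(x, kernel):
    # 1-D convolution with zero padding so the output keeps the input length.
    k = len(kernel)
    pad = k // 2
    xp = np.pad(x, (pad, k - 1 - pad))
    return np.array([np.dot(xp[i:i + k], kernel) for i in range(len(x))])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def adaptive_multiscale(x, kernels):
    # Run the series through convolutions of different kernel sizes,
    # then fuse the branches with softmax weights derived from each
    # branch's global average response, so the effective receptive
    # field adapts to the input (a selective-kernel-style heuristic).
    branches = [conv1d(x, k) for k in kernels]
    scores = np.array([b.mean() for b in branches])
    weights = softmax(scores)
    fused = sum(w * b for w, b in zip(weights, branches))
    return fused, weights

# Illustrative usage: fuse smoothing kernels of width 3, 7, and 11.
x = np.sin(np.linspace(0.0, 6.0, 64))
fused, weights = adaptive_multiscale(
    x, [np.ones(3) / 3, np.ones(7) / 7, np.ones(11) / 11]
)
```

In the paper itself the branch weights are learned end-to-end rather than computed from fixed statistics; this sketch only illustrates the shape of the computation.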
Pages: 6547-6569
Page count: 23