TATCN: Time Series Prediction Model Based on Time Attention Mechanism and TCN

Cited: 9
Authors
Wang, Hao [1]
Zhang, Zhenguo [1]
Affiliations
[1] Yanbian Univ, Dept Comp Sci & Technol, 977 Gongyuan Rd, Yanji 133002, Peoples R China
Source
2022 IEEE 2ND INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATION AND ARTIFICIAL INTELLIGENCE (CCAI 2022) | 2022
Funding
National Natural Science Foundation of China;
Keywords
TCN; attention mechanism; time series; prediction;
DOI
10.1109/CCAI55564.2022.9807714
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Prediction is an important task in time series data analysis. Temporal Convolutional Networks (TCNs) are a powerful tool for time series prediction and perform well on this task. However, the TCN model does not account for the varying influence of different historical segments on the predicted value, which limits its prediction accuracy to a certain extent. This paper therefore combines the attention mechanism with the data characteristics of time series, proposes a Time Attention mechanism (TA), and integrates it into the TCN framework to build a prediction model, called TATCN. In TATCN, the output vector of each TCN layer is convolved, and a sigmoid function generates weight coefficients; these coefficients are then multiplied with the original output vector to form a new output vector. The new output vector is added to the input vector of the current layer through a residual connection to produce the layer's final output, which is fed into the next layer of the network. Experimental results on EEG data and Yanbian electricity-fee data show that the proposed Time Attention mechanism effectively represents the importance of different historical data to the current prediction. The TATCN model significantly improves prediction accuracy over the TCN model and also outperforms RNN-based prediction models such as LSTM and GRU.
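A minimal sketch of the Time Attention step as described in the abstract, using NumPy. The channel count, sequence length, and attention kernel below are illustrative assumptions, not values taken from the paper; `conv1d_same` is a hypothetical helper standing in for the layer's convolution:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv1d_same(x, kernel):
    """Per-channel 1-D convolution with 'same' (edge) padding.
    x: (channels, time), kernel: (k,) with odd k."""
    pad = len(kernel) // 2
    xp = np.pad(x, ((0, 0), (pad, pad)), mode="edge")
    out = np.zeros_like(x)
    for c in range(x.shape[0]):
        for t in range(x.shape[1]):
            out[c, t] = np.dot(xp[c, t:t + len(kernel)], kernel)
    return out

def time_attention_block(layer_input, tcn_output, kernel):
    """Time Attention as sketched in the abstract:
    1. convolve the TCN layer's output vector,
    2. apply a sigmoid to obtain weight coefficients,
    3. multiply the weights with the original output vector,
    4. add the layer input via a residual connection."""
    weights = sigmoid(conv1d_same(tcn_output, kernel))  # steps 1-2
    attended = weights * tcn_output                     # step 3
    return attended + layer_input                       # step 4 (residual)

# Toy example: 2 channels, 5 time steps.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 5))   # input to the current layer
h = rng.standard_normal((2, 5))   # TCN output of the current layer
k = np.array([0.25, 0.5, 0.25])   # hypothetical attention kernel
y = time_attention_block(x, h, k)
print(y.shape)  # (2, 5) -- same shape, ready to feed the next layer
```

Because the sigmoid maps every coefficient into (0, 1), each time step of the TCN output is down-weighted individually before the residual addition, which is how the mechanism can emphasize some historical segments over others.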
Pages: 26 - 31
Number of pages: 6