FL-Net: A multi-scale cross-decomposition network with frequency external attention for long-term time series forecasting

Cited by: 7
Authors
Huang, Siyuan [1 ]
Liu, Yepeng [1 ,2 ]
Affiliations
[1] Shandong Technol & Business Univ, Sch Comp Sci & Technol, Yantai 264005, Peoples R China
[2] Shandong Future Intelligent Financial Engn Lab, Yantai 264005, Peoples R China
Keywords
Long-term time series forecasting (LTSF); Transformer-based methods; Frequency external attention (FEA); Multi-scale cross-decomposition (MSCD); NEURAL-NETWORK;
DOI
10.1016/j.knosys.2024.111473
CLC classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many real-world applications, such as energy-consumption alerts and long-term traffic planning, require predicting how data change over extended horizons, so long-term time series forecasting (LTSF) demands methods with robust prediction capabilities. Recently, transformer-based methods have shown immense potential for LTSF. However, these methods rely heavily on positional encoding to maintain temporal information, which inevitably loses temporal patterns. Moreover, their various self-attention mechanisms typically capture only intra-sequence features and lack the ability to capture inter-sequence features, and their quadratic time and memory complexity makes training on long input sequences challenging. We propose FL-Net to enhance the accuracy of LTSF. To capture the seasonal and trend components of a time series accurately, FL-Net segments the input sequence into coarse-grained trend and seasonal components using moving averages, and two sets of encoders extract temporal features from these components. Our proposed frequency external attention (FEA) method uses two external, compact, learnable, and shared memories to learn and store the time- and frequency-domain features of the entire training set, achieving high efficiency with linear complexity. In addition, our multi-scale cross-decomposition (MSCD) method further decomposes the trend and seasonal elements into finer-grained components, strengthening the model's ability to extract temporal features. Experimental results on nine real-world benchmark datasets show that FL-Net achieves higher long-term prediction accuracy than state-of-the-art methods.
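The moving-average split described in the abstract can be sketched in a few lines. This is a minimal illustration of series decomposition into trend and seasonal parts, not FL-Net's exact implementation; the kernel size of 25 and the edge-replication padding are assumptions for the example.

```python
import numpy as np

def decompose(x: np.ndarray, kernel: int = 25):
    """Split a 1-D series into a coarse trend and a seasonal residual.

    A moving average gives the trend; subtracting it leaves the seasonal
    component, so trend + seasonal reconstructs the input exactly.
    kernel=25 is an illustrative choice, not a value from the paper.
    """
    # Replicate edge values so the smoothed output keeps the input length.
    pad = kernel // 2
    padded = np.concatenate(
        [np.full(pad, x[0]), x, np.full(kernel - 1 - pad, x[-1])]
    )
    window = np.ones(kernel) / kernel
    trend = np.convolve(padded, window, mode="valid")  # length == len(x)
    seasonal = x - trend
    return trend, seasonal
```

Because the seasonal part is defined as the residual, the two components always sum back to the original series, which is what lets separate encoders process them without losing information.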
Pages: 13
Related papers (40 total)
  • [1] Adebiyi, Ayodele A.; Adewumi, Aderemi O.; Ayo, Charles K. Stock Price Prediction Using the ARIMA Model. 2014 UKSIM-AMSS 16th International Conference on Computer Modelling and Simulation (UKSIM), 2014: 106-112
  • [2] Ahmad, A. S.; Hassan, M. Y.; Abdullah, M. P.; Rahman, H. A.; Hussin, F.; Abdullah, H.; Saidur, R. A review on applications of ANN and SVM for building electrical energy consumption forecasting. Renewable & Sustainable Energy Reviews, 2014, 33: 102-109
  • [3] Bertozzi, Andrea L.; Franco, Elisa; Mohler, George; Short, Martin B.; Sledge, Daniel. The challenges of modeling and forecasting the spread of COVID-19. Proceedings of the National Academy of Sciences of the United States of America, 2020, 117 (29): 16732-16738
  • [4] Boes, Wim; Van Hamme, Hugo. Audiovisual Transformer Architectures for Large-Scale Classification and Synchronization of Weakly Labeled Audio Events. Proceedings of the 27th ACM International Conference on Multimedia (MM'19), 2019: 1961-1969
  • [5] Cirstea RG, 2022, TRIFORMER TRIANGULAR
  • [6] Dosovitskiy A., 2020, INT C LEARNING REPRE
  • [7] Elliott, Graham; Timmermann, Allan. Economic forecasting. Journal of Economic Literature, 2008, 46 (01): 3-56
  • [8] Gardner, Everette S., Jr. Exponential smoothing: The state of the art - Part II. International Journal of Forecasting, 2006, 22 (04): 637-666
  • [9] Graves A, 2012, STUD COMPUT INTELL, V385, P1, DOI [10.1007/978-3-642-24797-2, 10.1162/neco.1997.9.1.1]
  • [10] Guo, Meng-Hao; Liu, Zheng-Ning; Mu, Tai-Jiang; Hu, Shi-Min. Beyond Self-Attention: External Attention Using Two Linear Layers for Visual Tasks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45 (05): 5436-5447
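Entry [10] (Guo et al.) describes the external attention that the paper's FEA module builds on: attention is computed against two small shared memories rather than the sequence itself, so the cost is linear in the sequence length. Below is a minimal numpy sketch of that core operation, following Guo et al.'s double normalization; the memory size s and random inputs are illustrative assumptions, and this is not FL-Net's exact frequency-domain variant.

```python
import numpy as np

def external_attention(x, mk, mv):
    """External attention with two shared memories (after Guo et al.).

    x:  (n, d) input features for one sequence
    mk: (s, d) learnable key memory, shared across all samples
    mv: (s, d) learnable value memory, shared across all samples
    Cost is O(n * s * d): linear in the sequence length n.
    """
    # Similarity of each of the n tokens to the s memory slots: (n, s).
    a = x @ mk.T
    # Double normalization: softmax over the token axis, then
    # l1-normalization over the memory-slot axis.
    e = np.exp(a - a.max(axis=0, keepdims=True))
    attn = e / e.sum(axis=0, keepdims=True)          # softmax over tokens
    attn = attn / attn.sum(axis=1, keepdims=True)    # l1 norm over slots
    return attn @ mv                                 # (n, d) output
```

Because mk and mv are fixed-size parameters learned over the whole training set, they act as a compact memory of dataset-level patterns, which is the property the abstract's FEA exploits in both the time and frequency domains.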