OnsitNet: A memory-capable online time series forecasting model incorporating a self-attention mechanism

Cited by: 1
Authors
Liu, Hui [1 ,2 ,3 ]
Wang, Zhengkai [1 ]
Dong, Xiyao [1 ]
Du, Junzhao [1 ,2 ,3 ]
Affiliations
[1] Xidian Univ, Sch Comp Sci & Technol, Xian 710126, Peoples R China
[2] Minist Educ, Engn Res Ctr Blockchain Technol Applicat & Evaluat, Xian 710126, Peoples R China
[3] Key Lab Smart Human Comp Interact & Wearable Techn, Xian 710126, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Time series forecasting; Online learning; Offline learning; Self-attention mechanism; iTransformer;
DOI
10.1016/j.eswa.2024.125231
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Traditional time series (TS) forecasting models are trained on fixed, static datasets and do not scale to the continuous influx of data in real-world scenarios. Real-time online learning of data streams is therefore crucial for forecasting efficiency. However, few studies address online TS forecasting, and existing approaches have several limitations: most merely train on the data stream and handle concept drift poorly, often fail to adequately model dependencies between variables, and do not leverage the strong modeling capabilities of offline models. We propose an innovative online learning method called OnsitNet. It consists of stacked learning modules whose convolutional kernels progressively expand their receptive field via an exponentially growing dilation factor, aiding the capture of multi-scale features. Within each learning module, we propose an online learning strategy focused on memorizing concept-drift scenarios, composed of a fast learner, a memorizer, and a Pearson trigger. The Pearson trigger detects new data patterns and activates a dynamic interaction between the fast learner and the memorizer, enabling rapid online learning of the data stream. To capture dependencies between variables, we propose SITransformer, a streamlined version of the offline model iTransformer. Unlike the traditional Transformer, it reverses the roles of the feed-forward network and the attention mechanism; this inverted architecture is more effective at learning correlations between variables. Experiments on five real-world datasets show that OnsitNet achieves lower online prediction errors, enabling timely and effective forecasting of future trends in TS data.
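The abstract attributes OnsitNet's multi-scale feature capture to convolution kernels whose dilation grows exponentially across learning modules. The sketch below illustrates that standard dilated-convolution pattern in PyTorch; it is a minimal illustration of the mechanism rather than the authors' code, and the channel width, kernel size, and layer count are assumed values.

```python
import torch
import torch.nn as nn

class DilatedStack(nn.Module):
    """Stack of 1-D convolutions with exponentially growing dilation."""

    def __init__(self, channels: int = 32, kernel_size: int = 3, n_layers: int = 4):
        super().__init__()
        layers = []
        for i in range(n_layers):
            dilation = 2 ** i  # 1, 2, 4, 8, ... -> receptive field grows exponentially
            layers.append(nn.Conv1d(channels, channels, kernel_size,
                                    padding=dilation * (kernel_size - 1) // 2,
                                    dilation=dilation))
        self.layers = nn.ModuleList(layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); length is preserved by the symmetric padding
        for conv in self.layers:
            x = torch.relu(conv(x))
        return x
```

With kernel size 3, four such layers cover a receptive field of 1 + 2·(1 + 2 + 4 + 8) = 31 time steps, versus 9 for four undilated layers.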
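The fast-learner/memorizer interaction is gated by a "Pearson trigger" that detects new data patterns. The exact rule is not given in the abstract, so the following NumPy sketch is one plausible reading: fire when the Pearson correlation between the newest stream window and a stored reference pattern drops below a threshold. The window length of 96 and the threshold of 0.5 are hypothetical choices.

```python
import numpy as np

def pearson_trigger(new_window: np.ndarray,
                    reference: np.ndarray,
                    threshold: float = 0.5) -> bool:
    """Flag a suspected new data pattern when correlation with the reference is low."""
    r = np.corrcoef(new_window, reference)[0, 1]
    return r < threshold  # low correlation -> activate fast-learner/memorizer interaction

# Illustrative usage: an unrelated incoming pattern fires the trigger.
rng = np.random.default_rng(0)
reference = np.sin(np.linspace(0.0, 6.0, 96))   # pattern held by the memorizer
incoming = rng.normal(size=96)                  # uncorrelated stream window
print(pearson_trigger(incoming, reference))     # True: drift suspected
```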
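SITransformer is described as a streamlined iTransformer, which inverts the Transformer by embedding each variable's entire lookback window as a single token, so self-attention runs across variables while the feed-forward network encodes each series. A minimal PyTorch block in that style is sketched below, assuming the published iTransformer layout; d_model, the head count, and the FFN shape are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn

class InvertedAttentionBlock(nn.Module):
    """Attention over variate tokens, in the iTransformer-style inversion."""

    def __init__(self, seq_len: int = 96, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(seq_len, d_model)  # whole series -> one token per variable
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 2 * d_model), nn.GELU(),
                                 nn.Linear(2 * d_model, d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars) -> variate tokens: (batch, n_vars, d_model)
        tokens = self.embed(x.transpose(1, 2))
        attended, _ = self.attn(tokens, tokens, tokens)  # scores model inter-variable correlations
        tokens = tokens + attended
        return tokens + self.ffn(tokens)  # FFN encodes each variable's series representation
```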
Pages: 13
Related Papers
50 records in total
  • [31] Zhou, Y.; Wu, H.; Cheng, H.; Zheng, J.; Li, X. Pedestrian Trajectory Prediction Model Based on Self-Attention Mechanism and Group Behavior Characteristics. Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University, 2020, 45(12): 1989-1996.
  • [32] Mu, Xiaokai; Yi, Yuanhang; Zhu, Zhongben; Zhu, Lili; Wang, Zhuo; Qin, Hongde. Cruise Speed Model Based on Self-Attention Mechanism for Autonomous Underwater Vehicle Navigation. Remote Sensing, 2024, 16(14).
  • [33] Zhu, Jiayi; Tan, Ying; Lin, Rude; Miao, Jiaqing; Fan, Xuwei; Zhu, Yafei; Liang, Ping; Gong, Jinnan; He, Hui. Efficient self-attention mechanism and structural distilling model for Alzheimer's disease diagnosis. Computers in Biology and Medicine, 2022, 147.
  • [34] Wen, Mi; Huan, Junjie; Wei, Minjie; Su, Yun; Guo, Naiwang. Sparse spatial-temporal attention forecasting network: A new model for time series forecasting. International Journal of Modern Physics C, 2024.
  • [35] Li, Li; Chen, Xi; Hu, Sanjun. Application of an end-to-end model with self-attention mechanism in cardiac disease prediction. Frontiers in Physiology, 2024, 14.
  • [36] Guan, Faqian; Yu, Chunyan; Yang, Suqiong. A GAN Model With Self-attention Mechanism To Generate Multi-instruments Symbolic Music. 2019 International Joint Conference on Neural Networks (IJCNN), 2019.
  • [37] Wang, Xiaojia; Gong, Wenqing; Zhu, Keyu; Yao, Lushi; Zhang, Shanshan; Xu, Weiqun; Guan, Yuxiang. Sequential Prediction of Glycosylated Hemoglobin Based on Long Short-Term Memory with Self-Attention Mechanism. International Journal of Computational Intelligence Systems, 2020, 13(1): 1578-1589.
  • [39] Pan, Shaowei; Yang, Bo; Wang, Shukai; Guo, Zhi; Wang, Lin; Liu, Jinhua; Wu, Siyu. Oil well production prediction based on CNN-LSTM model with self-attention mechanism. Energy, 2023, 284.
  • [40] Du, Xiaozhi; Jia, Ning; Du, Hongyuan. FST-OAM: a fast style transfer model using optimized self-attention mechanism. Signal, Image and Video Processing, 2024, 18(5): 4191-4203.