Transformer-Based Water Quality Forecasting With Dual Patch and Trend Decomposition

Cited: 0
Authors
Lin, Yongze [1 ,2 ]
Qiao, Junfei [1 ,2 ]
Bi, Jing [2 ,3 ]
Yuan, Haitao [4 ]
Wang, Mengyuan [5 ]
Zhang, Jia [6 ]
Zhou, MengChu [7 ]
Affiliations
[1] Beijing Univ Technol, Sch Informat Sci & Technol, Beijing 100124, Peoples R China
[2] Beijing Univ Technol, Beijing Lab Smart Environm Protect, Beijing 100124, Peoples R China
[3] Beijing Univ Technol, Coll Comp Sci, Beijing 100124, Peoples R China
[4] Beihang Univ, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
[5] Beihang Univ, Sch Energy & Power Engn, Beijing 100191, Peoples R China
[6] Southern Methodist Univ, Dept Comp Sci, Dallas, TX 75206 USA
[7] New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
Source
IEEE INTERNET OF THINGS JOURNAL | 2025, Vol. 12, No. 8
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China
Keywords
Time series analysis; Market research; Computational modeling; Predictive models; Water quality; Noise; Transformers; Accuracy; Long short term memory; Semantics; Savitsky-Golay (SG) filter; self-supervised learning; transformer; trend decomposition; water quality time series prediction; NEURAL-NETWORKS;
DOI
Not available
Chinese Library Classification
TP [automation technology, computer technology]
Discipline code
0812
Abstract
Time series prediction is gaining increasing attention in many fields, e.g., air pollution, geological hazards, and network traffic prediction. Water quality prediction uses historical data to forecast future water quality. However, it is difficult to learn a representation of a time series that captures both trends and fluctuations, to effectively remove noise from the data, and to model complex nonlinear relationships. To solve these problems, this work proposes a time series prediction model, called DPSGT for short, which integrates Dual Patching, Savitsky-Golay (SG) filtering, and a Transformer. First, DPSGT adopts SG filtering to decompose the time series and reduce noise interference, improving long-term prediction capability. Second, to overcome limited temporal representation capability, DPSGT splits the time series into local and global patches, which capture local semantic information and enlarge the receptive field. Third, it utilizes a transformer mechanism to model the nonlinear dynamics of the water quality time series and improve prediction accuracy. Two real-world datasets are used to evaluate the proposed DPSGT, and experiments show that DPSGT improves root mean-square error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE), and R-2 by 6%, 5%, 8%, and 7%, respectively, compared with other benchmark models.
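The abstract's two preprocessing ideas, SG-filter trend decomposition and dual (local/global) patching, can be sketched in a few lines. This is a minimal illustration using SciPy's `savgol_filter`, not the authors' implementation: the function names (`sg_decompose`, `make_patches`), window sizes, patch lengths, and strides are all assumptions chosen for the toy example, since the paper's exact hyperparameters are not given here.

```python
import numpy as np
from scipy.signal import savgol_filter

def sg_decompose(series, window=11, polyorder=3):
    """Split a 1-D series into a smooth trend (SG filter) and a residual."""
    trend = savgol_filter(series, window_length=window, polyorder=polyorder)
    return trend, series - trend

def make_patches(series, patch_len, stride):
    """Slice a 1-D series into (possibly overlapping) patches, one per row."""
    starts = range(0, len(series) - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# Toy water-quality-like signal: a sinusoidal trend plus noise.
t = np.linspace(0, 4 * np.pi, 128)
x = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

trend, residual = sg_decompose(x)

# "Dual patch": short patches keep local semantics; long strided patches
# enlarge the receptive field. Both views would feed the transformer.
local_patches = make_patches(trend, patch_len=8, stride=8)     # (16, 8)
global_patches = make_patches(trend, patch_len=32, stride=16)  # (7, 32)
```

The decomposition is lossless by construction (`trend + residual == x`), so the model can process the denoised trend while the residual is handled or discarded separately.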
Pages: 10987-10997
Page count: 11
Related papers
50 records total
  • [21] Multi-scale patch transformer with adaptive decomposition for carbon emissions forecasting
    Li, Xiang
    Chu, Lei
    Li, Yujun
    Ding, Fengqian
    Quan, Zhenzhen
    Qu, Fangx
    Xing, Zhanjun
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 146
  • [22] Rankformer: Leveraging Rank Correlation for Transformer-based Time Series Forecasting
    Ouyang, Zuokun
    Jabloun, Meryem
    Ravier, Philippe
    2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 85 - 89
  • [23] Transformer-Based Models for Probabilistic Time Series Forecasting with Explanatory Variables
    Caetano, Ricardo
    Oliveira, Jose Manuel
    Ramos, Patricia
    MATHEMATICS, 2025, 13 (05)
  • [24] Temporal Saliency Detection Towards Explainable Transformer-Based Timeseries Forecasting
    Nghia Duong-Trung
    Duc-Manh Nguyen
    Danh Le-Phuoc
    ARTIFICIAL INTELLIGENCE-ECAI 2023 INTERNATIONAL WORKSHOPS, PT 1, XAI3, TACTIFUL, XI-ML, SEDAMI, RAAIT, AI4S, HYDRA, AI4AI, 2023, 2024, 1947 : 250 - 268
  • [25] Foreformer: an enhanced transformer-based framework for multivariate time series forecasting
    Yang, Ye
    Lu, Jiangang
    APPLIED INTELLIGENCE, 2023, 53 : 12521 - 12540
  • [26] GRU- and Transformer-Based Periodicity Fusion Network for Traffic Forecasting
    Zhang, Yazhe
    Liu, Shixuan
    Zhang, Ping
    Li, Bo
    ELECTRONICS, 2023, 12 (24)
  • [27] Enhanced Linear and Vision Transformer-Based Architectures for Time Series Forecasting
    Alharthi, Musleh
    Mahmood, Ausif
    BIG DATA AND COGNITIVE COMPUTING, 2024, 8 (05)
  • [28] Foreformer: an enhanced transformer-based framework for multivariate time series forecasting
    Yang, Ye
    Lu, Jiangang
    APPLIED INTELLIGENCE, 2023, 53 (10) : 12521 - 12540
  • [29] A systematic review for transformer-based long-term series forecasting
    Su, Liyilei
    Zuo, Xumin
    Li, Rui
    Wang, Xin
    Zhao, Heng
    Huang, Bingding
    ARTIFICIAL INTELLIGENCE REVIEW, 2025, 58 (03)
  • [30] Efficient Transformer-Based Compressed Video Modeling via Informative Patch Selection
    Suzuki, Tomoyuki
    Aoki, Yoshimitsu
    SENSORS, 2023, 23 (01)