Granular neural networks learning for time series prediction under a federated scenario

Cited by: 0
Authors
Song M. [1 ]
Zhao X. [1 ]
Affiliations
[1] School of Computer and Cyber Sciences, Communication University of China, Beijing
Funding
National Natural Science Foundation of China
Keywords
Federated learning (FL); Granular neural networks (GNNs); Particle swarm optimization (PSO); Time series prediction
DOI
10.1007/s41066-024-00490-6
Abstract
Granular neural networks (GNNs) are advanced prediction models that produce information granules, offering results that are more abstract and adaptable. In this study, we address three significant issues in time series prediction within a federated learning (FL) scenario: the management of distributed data, the aggregation of GNNs, and the optimization of granularity levels. Traditional centralized models are insufficient for managing distributed data while ensuring privacy and reducing communication costs, and existing studies on GNNs have not explored their aggregation under a federated framework, which is essential for enhancing model robustness and stability. Additionally, determining the optimal level of granularity for GNNs remains a challenge that affects the model's predictive accuracy and computational efficiency. To address these issues, we propose a novel federated learning framework that enhances the performance of GNNs for time series prediction. Our approach is a comprehensive FL framework that enables the collaborative training of local GNNs, refining their granular weights through global aggregation, ensuring better privacy management, and reducing communication overhead. By focusing on the aggregation of parameters within the federated scenario, we enhance the robustness and stability of GNNs, which are crucial for effective time series prediction. Furthermore, we determine the optimal levels of information granularity by employing multi-objective optimization techniques, specifically using Pareto fronts to balance the trade-offs between competing objectives. Experiments on predicting the air quality index at 35 stations in Beijing, China, show the effectiveness of our method. © The Author(s), under exclusive licence to Springer Nature Switzerland AG 2024.
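The abstract describes global aggregation of granular (interval-valued) weights across clients, but not the exact rule. As a minimal sketch, the following assumes a FedAvg-style, data-size-weighted average applied independently to each interval's lower and upper bounds; the function name and interval representation are illustrative, not taken from the paper.

```python
def aggregate_granular_weights(client_weights, client_sizes):
    """FedAvg-style aggregation of interval-valued weights (illustrative sketch).

    client_weights: one list of (lo, hi) interval weights per client.
    client_sizes: number of local training samples per client, used as
                  the aggregation weights.
    Returns the global list of (lo, hi) intervals.
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_weights = []
    for j in range(n_params):
        # Average the lower and upper bounds of parameter j separately,
        # weighted by each client's data size.
        lo = sum(w[j][0] * n for w, n in zip(client_weights, client_sizes)) / total
        hi = sum(w[j][1] * n for w, n in zip(client_weights, client_sizes)) / total
        global_weights.append((lo, hi))
    return global_weights
```

For example, with two clients holding 1 and 3 samples, `aggregate_granular_weights([[(0.0, 1.0), (2.0, 4.0)], [(1.0, 2.0), (0.0, 2.0)]], [1, 3])` returns `[(0.75, 1.75), (0.5, 2.5)]`, i.e. the larger client pulls each interval toward its own bounds.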