Improving LSTM forecasting through ensemble learning: a comparative analysis of various models

Cited by: 0
Authors
Zishan Ahmad [1 ]
Vengadeswaran Shanmugasundaram [1 ]
Rashid Biju [2 ]
undefined Khan [2 ]
Affiliations
[1] Department of Computer Science and Engineering, Indian Institute of Information Technology Kottayam, Kerala
[2] LANOVIZ Security Solutions, Ernakulam, Kerala
Keywords
ARIMA; BLSTM; Ensemble learning; GRU; LSTM; RNN; SARIMA
DOI
10.1007/s41870-024-02157-6
Abstract
Supply chain management covers the entire production process, from purchasing supplies to delivering the final product. Demand forecasting helps businesses anticipate future customer demand by analyzing historical data and market patterns. While much prior work focuses on optimizing a single model, this research compares statistical models such as ARIMA and SARIMA with deep learning models such as RNN, LSTM, GRU, and BLSTM. It further investigates ensemble learning with the LSTM model, discussing how an ensemble can improve on a single LSTM. The paper explores ensemble learning in two ways: (a) without model pruning, averaging all generated models, and (b) with model pruning, removing underperforming models and averaging only the top performers. Experiments on a public dataset from the University of Chicago achieved a low RMSE of 9.26 with the LSTM model improved via ensemble learning with model pruning. This pruned ensemble improved the accuracy of future customer demand predictions, and a complete pipeline integrating visualization and a notification system was developed. © Bharati Vidyapeeth's Institute of Computer Applications and Management 2024.
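The abstract describes two ensemble strategies built around the LSTM forecaster: averaging the forecasts of all trained members, and pruning underperforming members by validation RMSE before averaging the rest. The following is a minimal Python sketch of that averaging and pruning logic only; it assumes each ensemble member is an independently trained LSTM whose forecasts are already available as arrays, and the function names and the keep_top_k parameter are illustrative assumptions, not the authors' code.

# Hedged sketch of the two ensemble strategies from the abstract:
# (a) average every trained member, (b) prune the weakest members by
# validation RMSE and average only the top performers.
# Names such as prune_and_average and keep_top_k are illustrative.
import numpy as np

def rmse(y_true, y_pred):
    # Root mean squared error, the loss metric reported in the paper.
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def average_all(member_forecasts):
    # Strategy (a): simple average over every ensemble member's forecast.
    return np.mean(np.stack(member_forecasts), axis=0)

def prune_and_average(member_forecasts, val_scores, keep_top_k):
    # Strategy (b): drop members with the highest validation RMSE,
    # then average only the top-k performers.
    order = np.argsort(val_scores)  # ascending RMSE, best members first
    kept = [member_forecasts[i] for i in order[:keep_top_k]]
    return np.mean(np.stack(kept), axis=0)

if __name__ == "__main__":
    # Toy stand-ins for forecasts from independently trained LSTMs on a
    # held-out horizon; in the paper each member would be a separate LSTM run.
    rng = np.random.default_rng(0)
    truth = np.linspace(100, 120, 10)
    members = [truth + rng.normal(0, s, truth.shape) for s in (2, 3, 4, 15, 20)]
    val_scores = [rmse(truth, m) for m in members]
    print("average all :", rmse(truth, average_all(members)))
    print("prune + avg :", rmse(truth, prune_and_average(members, val_scores, keep_top_k=3)))

In this toy run the pruned ensemble scores a lower RMSE than averaging all members, which mirrors the qualitative claim in the abstract that pruning underperformers improves the ensemble.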
Pages: 5113 - 5131
Number of pages: 18
Related papers
50 items in total
  • [41] Improving Sentiment Analysis of Moroccan Tweets Using Ensemble Learning
    Oussous, Ahmed
    Ait Lahcen, Ayoub
    Belfkih, Samir
    BIG DATA, CLOUD AND APPLICATIONS, BDCA 2018, 2018, 872 : 91 - 104
  • [42] Short-Term Traffic Forecasting using LSTM-based Deep Learning Models
    Haputhanthri, Dilantha
    Wijayasiri, Adeesha
    MORATUWA ENGINEERING RESEARCH CONFERENCE (MERCON 2021) / 7TH INTERNATIONAL MULTIDISCIPLINARY ENGINEERING RESEARCH CONFERENCE, 2021, : 602 - 607
  • [43] Comparative Analysis of Machine Learning, Hybrid, and Deep Learning Forecasting Models: Evidence from European Financial Markets and Bitcoins
    Ampountolas, Apostolos
    FORECASTING, 2023, 5 (02) : 472 - 486
  • [44] Application of Ensemble Learning Techniques in Improving Accuracy and Robustness of Medical Classification Models
    Yang, Ruiyao
    PROCEEDINGS OF 2023 4TH INTERNATIONAL SYMPOSIUM ON ARTIFICIAL INTELLIGENCE FOR MEDICINE SCIENCE, ISAIMS 2023, 2023, : 1206 - 1211
  • [45] Improving Hybrid Models for Precipitation Forecasting by Combining Nonlinear Machine Learning Methods
    Parviz, Laleh
    Rasouli, Kabir
    Torabi Haghighi, Ali
    WATER RESOURCES MANAGEMENT, 2023, 37 (10) : 3833 - 3855
  • [46] Sensor Drift Compensation Based on the Improved LSTM and SVM Multi-Class Ensemble Learning Models
    Zhao, Xia
    Li, Pengfei
    Xiao, Kaitai
    Meng, Xiangning
    Han, Lu
    Yu, Chongchong
    SENSORS, 2019, 19 (18)
  • [47] Deep Learning for Speaker Recognition: A Comparative Analysis of 1D-CNN and LSTM Models Using Diverse Datasets
    Hassanzadeh, Hiwa
    Qadir, Jihad Anwar
    Omer, Saman Muhammad
    Ahmed, Mohammed Hussein
    Khezri, Edris
    4TH INTERDISCIPLINARY CONFERENCE ON ELECTRICS AND COMPUTER, INTCEC 2024, 2024,
  • [48] Heavy metal adsorption efficiency prediction using biochar properties: a comparative analysis for ensemble machine learning models
    Zaher Mundher Yaseen
    Farah Loui Alhalimi
    Scientific Reports, 15 (1)
  • [49] Demand Forecasting in Python: Deep Learning Model Based on LSTM Architecture versus Statistical Models
    Kolkova, Andrea
    Navratil, Miroslav
    ACTA POLYTECHNICA HUNGARICA, 2021, 18 (08) : 123 - 141
  • [50] A Comparative Analysis of Traditional and Machine Learning Methods in Forecasting the Stock Markets of China and the US
    Jin, Shangshang
    Science and Information Organization, 15 : 1 - 8