DeepTwin: A Deep Reinforcement Learning Supported Digital Twin Model for Micro-Grids

Cited by: 0
Authors
Ozkan, Erol [1 ]
Kok, Ibrahim [2 ]
Ozdemir, Suat [1 ]
Affiliations
[1] Hacettepe Univ, Dept Comp Engn, TR-06800 Ankara, Turkiye
[2] Ankara Univ, Dept Artificial Intelligence & Data Engn, TR-06100 Ankara, Turkiye
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Cloud computing; Optimization; Digital twins; Deep reinforcement learning (DRL); Data models; Real-time systems; Computational modeling; Mathematical models; Costs; Space heating; digital twin (DT); energy management system (EMS); Internet of Things (IoT); micro-grids
DOI
10.1109/ACCESS.2024.3521124
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
This paper presents the development and application of a Digital Twin (DT) model for the optimization of micro-grid operations. With the increasing integration of renewable energy resources (RERs) into power grids, micro-grids are essential for enhancing grid resilience and sustainability. The proposed DT model, enhanced with Deep Reinforcement Learning (DRL), simulates and optimizes key micro-grid functions, such as battery scheduling and load balancing, to improve energy efficiency and reduce operational costs. The model incorporates real-time monitoring, service-oriented simulations, cloud-based deployments, "what-if" analyses, advanced data analytics, and security features to enable comprehensive management of DTs. An optimization scenario was conducted to evaluate the effectiveness of the DT and DRL in improving micro-grid performance. The results demonstrated significant revenue improvements over the baseline: 81.7% for PPO (Proximal Policy Optimization) and 56.12% for SAC (Soft Actor-Critic). These findings highlight both the promising potential of DT technology and the critical importance of incorporating DRL techniques into DTs to improve system performance and resilience.
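The abstract describes DRL agents that learn battery charge/discharge schedules to maximize revenue under time-varying prices. As a minimal sketch of that scheduling problem, the snippet below trains a tabular Q-learning agent (a deliberate simplification standing in for the paper's PPO/SAC agents) on a hypothetical two-tariff battery MDP; the capacity, prices, and hyperparameters are invented for illustration and are not values from the paper.

```python
import random

# Toy battery-scheduling MDP. All parameters (capacity, tariffs, learning
# rates) are illustrative assumptions, not values from the paper.
CAPACITY = 4            # battery holds up to 4 energy units
PRICES = [1.0, 5.0]     # price per unit: index 0 = cheap tariff, 1 = expensive

def step(soc, price_level, action):
    """One transition: action 0 = idle, 1 = charge, 2 = discharge."""
    reward = 0.0
    if action == 1 and soc < CAPACITY:    # buy one unit at the current price
        soc += 1
        reward = -PRICES[price_level]
    elif action == 2 and soc > 0:         # sell one unit at the current price
        soc -= 1
        reward = PRICES[price_level]
    next_price = random.randint(0, 1)     # i.i.d. tariff switching, for simplicity
    return soc, next_price, reward

# Tabular Q-learning stands in for the paper's deep agents (PPO/SAC):
# same reward signal, but a lookup table instead of a neural policy.
random.seed(0)
Q = {(s, p): [0.0, 0.0, 0.0] for s in range(CAPACITY + 1) for p in (0, 1)}
alpha, gamma, eps = 0.1, 0.9, 0.1
soc, price = 0, 0
for _ in range(50_000):
    if random.random() < eps:             # epsilon-greedy exploration
        a = random.randint(0, 2)
    else:
        a = max(range(3), key=lambda i: Q[(soc, price)][i])
    nsoc, nprice, r = step(soc, price, a)
    target = r + gamma * max(Q[(nsoc, nprice)])
    Q[(soc, price)][a] += alpha * (target - Q[(soc, price)][a])
    soc, price = nsoc, nprice

# The learned policy should sell stored energy when the tariff is expensive
# and buy when it is cheap -- the arbitrage behaviour the paper optimizes.
print(max(range(3), key=lambda i: Q[(CAPACITY, 1)][i]))  # discharge = 2
print(max(range(3), key=lambda i: Q[(0, 0)][i]))         # charge = 1
```

The deep agents in the paper replace the lookup table with neural networks so the same idea scales to continuous states (load, generation, prices) and richer action spaces.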
Pages: 196432-196441
Page count: 10