Correlated Deep Q-learning based Microgrid Energy Management

Cited by: 10
Authors
Zhou, Hao [1 ]
Erol-Kantarci, Melike [1 ]
Affiliations
[1] Univ Ottawa, Sch Elect Engn & Comp Sci, Ottawa, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Microgrid; energy management; energy trading; deep Q-learning; correlated equilibrium;
DOI
10.1109/camad50429.2020.9209254
CLC Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Microgrid (MG) energy management is an important part of MG operation. It typically involves multiple entities, e.g., the energy storage system (ESS), renewable energy resources (RER), and user loads, and coordinating these entities is crucial. Considering the significant potential of machine learning techniques, this paper proposes a correlated deep Q-learning (CDQN) based technique for MG energy management. Each electrical entity is modeled as an agent with its own neural network that predicts its Q-values, and a correlated equilibrium is then used to coordinate the agents' operation. A Long Short-Term Memory (LSTM) based deep Q-learning algorithm is introduced, and the correlated equilibrium is applied to coordinate the agents. Simulation results show 40.9% and 9.62% higher profit for the ESS agent and the photovoltaic (PV) agent, respectively.
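The abstract's pipeline lends itself to a compact illustration: each agent's LSTM network maps a short history of observations to Q-value estimates, and a correlated equilibrium over the joint action space then selects a coordinated action. The sketch below is a minimal, hypothetical rendering of that idea, not the authors' implementation; the network shapes, the choice of having each agent's network output Q-values over the joint action space, the utilitarian objective (maximizing the sum of expected Q-values), and all names and parameters are assumptions.

```python
# Hypothetical sketch of correlated deep Q-learning (CDQN): per-agent LSTM Q-networks
# plus a correlated-equilibrium joint-action selector. Not the paper's implementation.
import itertools
import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import linprog


class LSTMQNetwork(nn.Module):
    """LSTM-based Q-network: maps a history of states to Q-values, one per joint action."""

    def __init__(self, state_dim: int, n_joint_actions: int, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_joint_actions)

    def forward(self, state_seq: torch.Tensor) -> torch.Tensor:
        # state_seq: (batch, seq_len, state_dim) -> Q-values: (batch, n_joint_actions)
        out, _ = self.lstm(state_seq)
        return self.head(out[:, -1, :])


def correlated_equilibrium(q_tables, rng=None):
    """Find a joint-action distribution maximizing the sum of agents' expected Q-values
    subject to correlated-equilibrium (no profitable deviation) constraints, then sample
    a joint action. q_tables: one numpy array per agent, shaped (n_a0, n_a1, ...)."""
    rng = rng or np.random.default_rng()
    action_counts = q_tables[0].shape
    joint_actions = list(itertools.product(*[range(n) for n in action_counts]))
    n_joint = len(joint_actions)

    # Objective: maximize total expected Q-value (linprog minimizes, hence the minus sign).
    c = -np.array([sum(q[ja] for q in q_tables) for ja in joint_actions])

    # One constraint per (agent, recommended action, deviation): deviating must not pay off.
    A_ub, b_ub = [], []
    for i, q_i in enumerate(q_tables):
        for a in range(action_counts[i]):
            for a_dev in range(action_counts[i]):
                if a_dev == a:
                    continue
                row = np.zeros(n_joint)
                for k, ja in enumerate(joint_actions):
                    if ja[i] == a:
                        ja_dev = ja[:i] + (a_dev,) + ja[i + 1:]
                        row[k] = q_i[ja_dev] - q_i[ja]
                A_ub.append(row)
                b_ub.append(0.0)

    # Probabilities are non-negative and sum to one.
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.ones((1, n_joint)), b_eq=[1.0],
                  bounds=[(0.0, 1.0)] * n_joint, method="highs")
    probs = np.clip(res.x, 0.0, None)
    probs /= probs.sum()
    return joint_actions[rng.choice(n_joint, p=probs)]


if __name__ == "__main__":
    # Toy example with two agents (e.g., ESS and PV), 3 actions each -> 9 joint actions.
    ess_net = LSTMQNetwork(state_dim=4, n_joint_actions=9)
    pv_net = LSTMQNetwork(state_dim=4, n_joint_actions=9)
    history = torch.randn(1, 24, 4)  # 24-step history of a 4-dimensional observation
    q_ess = ess_net(history).detach().numpy().reshape(3, 3)
    q_pv = pv_net(history).detach().numpy().reshape(3, 3)
    print(correlated_equilibrium([q_ess, q_pv]))  # e.g., (1, 2)
```

In such a setup, at each decision step every agent would roll its recent observation history through its own Q-network, the resulting Q-estimates would be passed to the equilibrium solver, and each agent would execute its component of the sampled joint action; the usual DQN machinery (replay buffer, target network, exploration schedule) is omitted here for brevity.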
Pages: 6
Related Papers
50 records in total
  • [1] Comparative analysis of Q-learning, SARSA, and deep Q-network for microgrid energy management
    Ramesh, Sreyas
    Sukanth, B. N.
    Sathyavarapu, Sri Jaswanth
    Sharma, Vishwash
    Kumaar, A. A. Nippun
    Khanna, Manju
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [2] An Online Home Energy Management System using Q-Learning and Deep Q-Learning
    Izmitligil, Hasan
    Karamancioglu, Abdurrahman
    SUSTAINABLE COMPUTING-INFORMATICS & SYSTEMS, 2024, 43
  • [3] Data-Based Optimal Microgrid Management for Energy Trading With Integral Q-Learning Scheme
    Lv, Yongfeng
    Wu, Zhaolong
    Zhao, Xiaowei
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (18): 16183-16193
  • [4] Improved Q-learning for Energy Management in a Grid-tied PV Microgrid
    Arwa, Erick O.
    Folly, Komla A.
    SAIEE AFRICA RESEARCH JOURNAL, 2021, 112 (02): 77-88
  • [5] Deep Reinforcement Learning: From Q-Learning to Deep Q-Learning
    Tan, Fuxiao
    Yan, Pengfei
    Guan, Xinping
    NEURAL INFORMATION PROCESSING (ICONIP 2017), PT IV, 2017, 10637: 475-483
  • [6] Energy management based on reinforcement learning with double deep Q-learning for a hybrid electric tracked vehicle
    Han, Xuefeng
    He, Hongwen
    Wu, Jingda
    Peng, Jiankun
    Li, Yuecheng
    APPLIED ENERGY, 2019, 254
  • [7] Energy management strategy for electric vehicles based on deep Q-learning using Bayesian optimization
    Huifang Kong
    Jiapeng Yan
    Hai Wang
    Lei Fan
    Neural Computing and Applications, 2020, 32: 14431-14445
  • [8] Energy management strategy for electric vehicles based on deep Q-learning using Bayesian optimization
    Kong, Huifang
    Yan, Jiapeng
    Wang, Hai
    Fan, Lei
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (18): 14431-14445
  • [9] Development of a deep Q-learning energy management system for a hybrid electric vehicle
    Tresca L.
    Pulvirenti L.
    Rolando L.
    Millo F.
    Transportation Engineering, 2024, 16
  • [10] Microgrid energy management using deep Q-network reinforcement learning
    Alabdullah, Mohammed H.
    Abido, Mohammad A.
    ALEXANDRIA ENGINEERING JOURNAL, 2022, 61 (11): 9069-9078