CuEMS: Deep reinforcement learning for community control of energy management systems in microgrids

Cited by: 4
Authors
Li, Jianbin [1 ]
Jiang, Zeshuo [1 ]
Chen, Zhiqiang [2 ,3 ]
Liu, Jinwei [4 ]
Cheng, Long [1 ]
Affiliations
[1] North China Elect Power Univ, Sch Control & Comp Engn, Beijing 102206, Peoples R China
[2] Eastern Inst Technol, Ningbo 315200, Peoples R China
[3] Ningbo Inst Digital Twin, Ningbo 315201, Peoples R China
[4] Florida A&M Univ, Dept Comp & Informat Sci, Tallahassee, FL 32307 USA
Keywords
Deep reinforcement learning; Community microgrid; Energy management; User comfort; Energy storage system; Demand response; Strategy; Loads
DOI
10.1016/j.enbuild.2023.113865
Chinese Library Classification
TU [Building Science]
Discipline Code
0813
Abstract
Nowadays, energy management plays a pivotal role in ensuring the stable operation of microgrids, enabling enhanced monitoring, analysis, and optimization of energy use for end-users. An energy management system (EMS) offers invaluable assistance to microgrid users, helping them cut electricity costs, ensure reliability, and maintain a high level of comfort. However, as the scope of microgrid management expands, balancing the comfort of community microgrid users, who operate ever more devices, against operator profitability becomes increasingly challenging. To address this issue, this paper proposes a novel community control approach for EMS, called CuEMS, which uses deep reinforcement learning (DRL) to manage multiple appliances and energy storage systems (ESSs). The proposed approach optimizes energy profitability for operators while taking user comfort into account. It reformulates energy scheduling as a Markov decision process and learns the optimal scheduling strategy through deep Q-networks. The CuEMS approach is evaluated using real-world data from Finland, and the results demonstrate that it optimizes the operation of multiple appliances and improves profitability and user comfort compared with other methods.
Pages: 9
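
The abstract describes reformulating appliance and ESS scheduling as a Markov decision process solved with deep Q-networks. As a rough illustration of that general formulation only, the sketch below implements a standard DQN update in PyTorch; the state layout, action set, network sizes, and hyperparameters are assumptions for illustration and are not taken from the paper, whose reward additionally balances operator profit against a user-comfort term.

import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 4   # e.g. [hour, price, ess_soc, appliance_demand] (assumed layout)
N_ACTIONS = 3   # e.g. {charge ESS, discharge ESS, idle} (assumed action set)
GAMMA = 0.95    # discount factor for future reward


class QNetwork(nn.Module):
    """Maps a state vector to one Q-value per scheduling action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)


q_net, target_net = QNetwork(), QNetwork()
target_net.load_state_dict(q_net.state_dict())  # sync periodically in training
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)  # experience replay buffer of (s, a, r, s', done)


def act(state, epsilon):
    """Epsilon-greedy action selection over the learned Q-values."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return q_net(torch.tensor(state).float()).argmax().item()


def train_step(batch_size=32):
    """One gradient step on the DQN temporal-difference loss. The reward r is
    an environment-supplied scalar; in CuEMS it would combine operator profit
    with a comfort term, per the abstract."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2, done = (torch.tensor(x).float() for x in zip(*batch))
    q = q_net(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + GAMMA * (1 - done) * target_net(s2).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In use, an agent would roll out episodes against a microgrid simulator, append each transition to the replay buffer, and call train_step after every environment step while slowly decaying epsilon; the target network gives the bootstrapped Q-targets the stability that plain Q-learning with function approximation lacks.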