Real-Time Microgrid Energy Scheduling Using Meta-Reinforcement Learning

Cited by: 2
Authors
Shen, Huan [1 ]
Shen, Xingfa [1 ]
Chen, Yiming [1 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Comp Sci, Hangzhou 310018, Peoples R China
Keywords
microgrid; energy management; meta-learning; reinforcement learning; online scheduling; renewable energy; management; operation
DOI: 10.3390/en17102367
Chinese Library Classification: TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Subject Classification Codes: 0807; 0820
Abstract
With the rapid development of renewable energy and the increasing maturity of energy storage technology, microgrids are being deployed rapidly worldwide. The stochastic nature of microgrid scheduling can raise operational costs and waste resources, so real-time scheduling is essential for reducing costs and improving resource utilization. Reinforcement learning (RL) can produce good scheduling policies once extensive data have been collected, but it adapts slowly to unfamiliar environments. Meta-learning, with its generalization ability, can compensate for this deficiency. This paper therefore introduces a microgrid scheduling strategy that combines RL with meta-learning. The method adapts quickly to different environments from a small amount of training data, enabling rapid generation of energy scheduling policies in the early stages of microgrid operation. The paper first establishes a microgrid model comprising energy storage, load, and distributed generation (DG), and then trains an initial scheduling strategy within a meta-reinforcement learning framework while respecting the microgrid's operational constraints. Experimental results show that the RL strategy based on model-agnostic meta-learning (MAML) improves energy utilization and reduces operational costs in the early stages of microgrid operation. This research provides a new intelligent solution for the efficient, stable, and economical operation of microgrids in their initial stages.
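To make the meta-reinforcement learning workflow described in the abstract concrete, the following is a minimal, self-contained sketch of meta-training a scheduling policy across randomly sampled microgrid "tasks". Everything in it is an illustrative assumption rather than the authors' implementation: the ToyMicrogridEnv (single battery, sinusoidal load and PV, grid-import cost), the deterministic linear-tanh policy, the finite-difference policy-gradient inner loop, and the Reptile-style first-order outer update that stands in for full second-order MAML.

```python
"""Minimal first-order MAML-style meta-RL sketch for microgrid scheduling.
All environment parameters, the policy form, and the hyperparameters below
are illustrative assumptions, not values from the paper."""
import numpy as np

rng = np.random.default_rng(0)

class ToyMicrogridEnv:
    """Toy single-battery microgrid: state = (load, pv, soc); the scalar action
    in [-1, 1] charges (+) or discharges (-) the battery; cost = price * grid import."""
    def __init__(self, price, pv_scale, horizon=24):
        self.price, self.pv_scale, self.horizon = price, pv_scale, horizon

    def reset(self):
        self.t, self.soc = 0, 0.5
        return self._obs()

    def _obs(self):
        load = 0.6 + 0.3 * np.sin(2 * np.pi * self.t / self.horizon)
        pv = self.pv_scale * max(0.0, np.sin(np.pi * self.t / self.horizon))
        return np.array([load, pv, self.soc])

    def step(self, action):
        load, pv, _ = self._obs()
        charge = float(np.clip(action, -self.soc, 1.0 - self.soc))  # keep SoC in [0, 1]
        self.soc += charge
        net = max(0.0, load - pv + charge)   # power drawn from the main grid
        reward = -self.price * net           # negative operating cost
        self.t += 1
        return self._obs(), reward, self.t >= self.horizon

def policy_action(theta, obs):
    return np.tanh(theta @ obs)              # deterministic linear-tanh policy

def episode_return(theta, env):
    obs, total, done = env.reset(), 0.0, False
    while not done:
        obs, r, done = env.step(policy_action(theta, obs))
        total += r
    return total

def gradient_step(theta, env, lr=0.05, eps=1e-2):
    """Finite-difference policy-gradient step (stand-in for the RL inner loop)."""
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        d = np.zeros_like(theta)
        d[i] = eps
        grad[i] = (episode_return(theta + d, env) - episode_return(theta - d, env)) / (2 * eps)
    return theta + lr * grad

# Outer loop: Reptile-style first-order meta-update over sampled microgrid "tasks",
# each task differing in electricity price and PV capacity.
theta = rng.normal(scale=0.1, size=3)
for meta_iter in range(200):
    task = ToyMicrogridEnv(price=rng.uniform(0.5, 1.5), pv_scale=rng.uniform(0.2, 1.0))
    adapted = theta
    for _ in range(3):                       # a few inner adaptation steps per task
        adapted = gradient_step(adapted, task)
    theta += 0.1 * (adapted - theta)         # move meta-parameters toward adapted parameters

print("meta-trained policy parameters:", theta)
```

At deployment, a newly commissioned microgrid would start from the meta-trained parameters and run only a few inner adaptation steps on its own data, which mirrors the rapid early-stage policy generation the abstract describes.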
Pages: 15