Deep Reinforcement Learning Based Pricing Strategy of Aggregators Considering Renewable Energy

Cited by: 16
Authors
Chuang, Yu-Chieh [1]
Chiu, Wei-Yu [1]
Affiliations
[1] Natl Tsing Hua Univ, Dept Elect Engn, MOCaRL Lab, Hsinchu 300044, Taiwan
Source
IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE | 2022, Vol. 6, No. 3
Keywords
Pricing; Renewable energy sources; Reinforcement learning; Deep learning; Smart grids; Aerospace electronics; Uncertainty; Deep reinforcement learning; smart grid; energy aggregator; pricing strategy; energy trading; renewable energy; DEMAND RESPONSE; MANAGEMENT; STORAGE; MODEL
DOI
10.1109/TETCI.2021.3109954
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
With the rapid development of information and communications technology and the high penetration of renewable energy, the role of an aggregator in a smart grid has emerged to better coordinate power and cash flows between energy producers and consumers through the adjustment of pricing signals. This study proposes variation indices based on the statistics of renewables and a control law for an energy storage system. A deep reinforcement learning based pricing strategy is then developed that maximizes the aggregator's profit while accounting for the energy balance. The proposed approach can consider opponents' behaviors, the variability of renewables, and varying bounds on charging and discharging events in a nonstationary environment, which can hardly be addressed by pricing strategies based on conventional learning algorithms such as Q-learning and deep Q-network. Numerical analysis using real-world data shows that the proposed approach can outperform existing pricing strategies in terms of learning speed and aggregator profit.
Pages: 499-508
Page count: 10
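The abstract does not include code; as a rough illustration of the kind of approach it describes, the following minimal sketch trains a policy-gradient agent to set an aggregator's hourly retail price in a toy environment with variable renewable generation and a rule-based storage control law that respects charging/discharging bounds. This is a hypothetical example, not the authors' implementation: the environment dynamics, the names PricePolicy, storage_step, and run_episode, and all numeric parameters are assumptions made for illustration, and the paper's actual deep RL architecture is not specified in the abstract.

```python
# Hypothetical sketch (not the paper's code): a REINFORCE agent that learns an
# aggregator's hourly price while a simple rule keeps storage within its bounds.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
rng = np.random.default_rng(0)

class PricePolicy(nn.Module):
    """Gaussian policy over the retail price given (base demand, renewables, storage SoC)."""
    def __init__(self, state_dim=3, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 1))
        self.log_std = nn.Parameter(torch.zeros(1))  # learned exploration noise

    def forward(self, state):
        mu = self.net(state)
        return torch.distributions.Normal(mu, self.log_std.exp())

def storage_step(net_energy, soc, capacity=10.0, max_rate=2.0):
    """Rule-based control law: charge surpluses, discharge for deficits, within bounds."""
    if net_energy >= 0:                                   # renewable surplus -> charge
        flow = min(net_energy, max_rate, capacity - soc)
    else:                                                 # deficit -> discharge
        flow = -min(-net_energy, max_rate, soc)
    return flow, soc + flow

def run_episode(policy, hours=24, wholesale=3.0):
    """Simulate one day; return log-probs of the chosen prices and hourly profits."""
    soc, logps, rewards = 5.0, [], []
    for t in range(hours):
        renewable = max(0.0, rng.normal(4.0, 1.5))        # variable generation
        base_demand = 6.0 + 2.0 * np.sin(2 * np.pi * t / 24)
        state = torch.tensor([base_demand, renewable, soc], dtype=torch.float32)
        dist = policy(state)
        price = dist.sample()
        logps.append(dist.log_prob(price).sum())
        p = float(price.clamp(0.5, 5.0))                  # keep the price in a sane band
        demand = max(0.0, base_demand - 1.2 * p + rng.normal(0, 0.3))  # price-sensitive load
        flow, soc = storage_step(renewable - demand, soc)
        shortfall = max(0.0, demand - renewable + flow)   # energy bought from the grid
        rewards.append(p * demand - wholesale * shortfall)
    return torch.stack(logps), np.array(rewards)

policy = PricePolicy()
opt = torch.optim.Adam(policy.parameters(), lr=3e-3)
baseline = 0.0
for it in range(300):                                     # REINFORCE with a moving baseline
    logps, rewards = run_episode(policy)
    total = float(rewards.sum())
    baseline = 0.95 * baseline + 0.05 * total
    loss = -(total - baseline) * logps.sum()
    opt.zero_grad(); loss.backward(); opt.step()
    if (it + 1) % 100 == 0:
        print(f"iteration {it + 1}: daily profit ~ {total:.1f}")
```

The sketch only captures the broad structure suggested by the abstract (a learned pricing policy coupled with a fixed storage control law under renewable uncertainty); the paper's method additionally handles opponents' behaviors and nonstationarity, which this toy example does not model.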