Charging Efficiency Optimization Based on Swarm Reinforcement Learning Under Dynamic Energy Consumption for WRSN

Cited by: 2
Authors
Chen, Jingyang [1 ]
Li, Xiaohui [1 ]
Ding, Yuemin [2 ]
Cai, Bin [1 ]
He, Jie [1 ]
Zhao, Min [1 ]
Affiliations
[1] Wuhan Univ Sci & Technol, Sch Informat Sci & Engn, Wuhan 430081, Hubei, Peoples R China
[2] Univ Navarra, Tecnun Sch Engn, Gipuzkoa 20009, Spain
Funding
National Natural Science Foundation of China;
Keywords
Optimization; Wireless sensor networks; Sensors; Energy consumption; Reinforcement learning; Convergence; Schedules; Charging efficiency optimization; swarm reinforcement learning (SRL); wireless rechargeable sensor network (WRSN); SENSOR NETWORKS; WIRELESS; ALGORITHM;
DOI
10.1109/JSEN.2024.3407748
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
Wireless rechargeable sensor networks (WRSNs) have been widely used to solve the energy constraint problem of wireless sensor networks (WSNs). Improving the charging efficiency of the mobile charger (MC) is crucial for a WRSN. However, the time-varying state of the network, caused by dynamic changes in node energy consumption while the MC is charging, makes optimizing the charging efficiency rather difficult. To solve this problem, an optimization approach based on swarm reinforcement learning (SRL) is presented in this article. The presented approach enables multiple agents to better adapt to the dynamic energy distribution in the WRSN by designing a dynamic energy consumption model. It then utilizes a rank-based ant system (AS_rank) to provide the MC with an initial optimal order of requesting sensor nodes (SNs), which accelerates the convergence of SRL. Finally, it adopts particle swarm optimization (PSO) to improve learning effectiveness during the exchange of experience among multiple agents, which helps optimize the charging path of the MC. Extensive simulations show that the presented approach achieves better charging performance than existing typical approaches and has significant advantages in terms of charging efficiency, SN dead ratio, and MC energy efficiency ratio.
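The three-stage pipeline summarized above can be pictured with a short sketch. The Python snippet below is only an illustrative analogue, not the paper's implementation: it swaps each component for a simple stand-in (a nearest-neighbour ordering in place of AS_rank, stochastic 2-swap local search in place of each agent's reinforcement learning, and a segment-copy step toward the swarm-best tour in place of the PSO experience exchange). Every node position, drain rate, cost term, and parameter is an assumption made purely for illustration.

```python
# Illustrative sketch of the SRL charging loop from the abstract (assumptions only).
import math
import random

random.seed(1)

N = 6                                                    # requesting SNs (assumed)
NODES = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(N)]
BASE = (0.0, 0.0)                                        # MC depot (assumed)
RATES = [random.uniform(0.1, 1.0) for _ in range(N)]     # per-SN drain rate (assumed)


def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def path_cost(order):
    """MC travel distance plus a toy dynamic-energy term: each SN is assumed to
    keep draining at RATES[i] until the MC reaches it (placeholder for the
    paper's dynamic energy consumption model)."""
    travelled, waiting, cur = 0.0, 0.0, BASE
    for i in order:
        travelled += dist(cur, NODES[i])
        waiting += RATES[i] * travelled
        cur = NODES[i]
    return travelled + waiting


def greedy_initial_order():
    """Stand-in for the AS_rank stage: a nearest-neighbour order that seeds the
    agents with a reasonable initial charging tour."""
    left, cur, order = set(range(N)), BASE, []
    while left:
        nxt = min(left, key=lambda i: dist(cur, NODES[i]))
        order.append(nxt)
        cur = NODES[nxt]
        left.remove(nxt)
    return order


def local_search(order, steps=200):
    """One agent's learning episode, abstracted as stochastic 2-swap hill
    climbing on the charging order."""
    best, best_c = order[:], path_cost(order)
    for _ in range(steps):
        i, j = random.sample(range(N), 2)
        cand = best[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = path_cost(cand)
        if c < best_c:
            best, best_c = cand, c
    return best, best_c


def pso_style_exchange(orders, global_best):
    """PSO-flavoured experience exchange: each agent copies a random segment of
    the swarm-best order, pulling its own solution toward the global best."""
    new_orders = []
    for order in orders:
        a, b = sorted(random.sample(range(N), 2))
        segment = global_best[a:b + 1]
        rest = [i for i in order if i not in segment]
        new_orders.append(rest[:a] + segment + rest[a:])
    return new_orders


# Swarm loop: several agents improve in parallel, then share experience.
agents = [greedy_initial_order() for _ in range(4)]
g_best, g_cost = None, float("inf")
for epoch in range(10):
    results = [local_search(o) for o in agents]
    agents = [o for o, _ in results]
    for o, c in results:
        if c < g_cost:
            g_best, g_cost = o[:], c
    agents = pso_style_exchange(agents, g_best)

print("best charging order:", g_best, "cost:", round(g_cost, 1))
```

The design intent mirrors the abstract: a heuristic initialization speeds up convergence, independent agents explore in parallel, and a periodic pull toward the best-known solution spreads experience across the swarm.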
Pages: 33427-33441
Number of pages: 15