Developing an eco-driving strategy in a hybrid traffic network using reinforcement learning

Cited: 5
Authors
Jamil, Umar [1 ]
Malmir, Mostafa [1 ]
Chen, Alan [2 ]
Filipovska, Monika [3 ]
Xie, Mimi [4 ]
Ding, Caiwen [5 ]
Jin, Yu-Fang [1 ]
Affiliations
[1] Univ Texas San Antonio, Dept Elect & Comp Engn, San Antonio, TX 78249 USA
[2] Westlake High Sch, Austin, TX USA
[3] Univ Connecticut, Dept Civil Engn, Storrs, CT USA
[4] Univ Texas San Antonio, Dept Comp Sci, San Antonio, TX 78249 USA
[5] Univ Connecticut, Dept Comp Sci & Engn, Storrs, CT USA
Funding
U.S. National Science Foundation;
Keywords
Eco-driving; hybrid traffic network; reinforcement learning; traffic flow control; fuel consumption; microscopic traffic simulator; SUSTAINABILITY; INTERSECTIONS; VEHICLES;
DOI
10.1177/00368504241263406
CLC Classification Number
G40 [Education];
Discipline Code
040101; 120403;
Abstract
Eco-driving has garnered considerable research attention owing to its potential socio-economic impact, including enhanced public health and mitigated climate-change effects through the reduction of greenhouse gas emissions. With more autonomous vehicles (AVs) expected on the road, developing an eco-driving strategy for hybrid traffic networks that encompass AVs and human-driven vehicles (HDVs), coordinated with traffic lights, is a challenging task. The challenge stems partly from insufficient infrastructure for collecting, transmitting, and sharing real-time traffic data among vehicles, facilities, and traffic control centers, and for the subsequent decision-making of the agents involved in traffic control. In addition, the intricate nature of existing traffic networks, with their diverse array of vehicles and facilities, hinders the development of a mathematical model that accurately characterizes the traffic network. In this study, we used the Simulation of Urban MObility (SUMO) simulator to tackle the first challenge through computational analysis. To address the second challenge, we employed a model-free reinforcement learning (RL) algorithm, proximal policy optimization (PPO), to decide the actions of AVs and traffic light signals in the traffic network. A novel eco-driving strategy was proposed by introducing different percentages of AVs into the traffic flow and coordinating them with traffic light signals through RL to control the overall speed of the vehicles, thereby improving fuel efficiency. Average rewards at different AV penetration rates (5%, 10%, and 20% of total vehicles) were compared with the case of no AVs in the traffic flow (0% penetration rate). The 10% penetration rate converged to the average reward in the minimum time, leading to a significant reduction in fuel consumption and in the total delay of all vehicles.
Pages: 24
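To make the described setup concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of how a SUMO traffic network driven through the TraCI API can be wrapped as a reinforcement learning environment and trained with PPO. The scenario file name "network.sumocfg", the "av" vehicle-ID prefix used to mark the AV penetration share, the single-intersection traffic-light handling, and the reward weights are all assumptions for illustration only; the reward simply penalizes SUMO-reported fuel consumption and a halted-vehicle delay proxy, in the spirit of the abstract.

```python
# Minimal sketch of an eco-driving RL loop on SUMO via TraCI, trained with PPO.
# Assumptions: a SUMO scenario "network.sumocfg" whose route file labels AVs with
# the ID prefix "av" at the chosen penetration rate; reward weights are illustrative.
import gymnasium as gym
import numpy as np
import traci
from stable_baselines3 import PPO

SUMO_CMD = ["sumo", "-c", "network.sumocfg"]  # placeholder config path (assumption)
EPISODE_STEPS = 1000


class EcoDrivingEnv(gym.Env):
    """AV speed advisory plus traffic-light phase switching, rewarded for low fuel use and delay."""
    observation_space = gym.spaces.Box(low=0.0, high=np.inf, shape=(4,), dtype=np.float32)
    # action[0]: fraction of the allowed speed commanded to AVs; action[1] > 0.5: advance TL phase.
    action_space = gym.spaces.Box(low=0.0, high=1.0, shape=(2,), dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        if traci.isLoaded():
            traci.close()
        traci.start(SUMO_CMD)
        self.steps = 0
        return self._observe(), {}

    def step(self, action):
        # Command every AV (ID prefix "av") to a fraction of its allowed speed.
        for vid in traci.vehicle.getIDList():
            if vid.startswith("av"):
                target = float(action[0]) * traci.vehicle.getAllowedSpeed(vid)
                traci.vehicle.setSpeed(vid, target)
        # Optionally advance the phase of the first traffic light (single-TL simplification).
        tls = traci.trafficlight.getIDList()
        if tls and action[1] > 0.5:
            tl = tls[0]
            n_phases = len(traci.trafficlight.getAllProgramLogics(tl)[0].phases)
            traci.trafficlight.setPhase(tl, (traci.trafficlight.getPhase(tl) + 1) % n_phases)
        traci.simulationStep()
        self.steps += 1

        ids = traci.vehicle.getIDList()
        fuel = sum(traci.vehicle.getFuelConsumption(v) for v in ids)     # SUMO-reported rate
        halted = sum(1 for v in ids if traci.vehicle.getSpeed(v) < 0.1)  # delay proxy
        reward = -(1e-3 * fuel + 0.5 * halted)                           # illustrative weights
        done = self.steps >= EPISODE_STEPS
        return self._observe(), reward, done, False, {}

    def _observe(self):
        ids = traci.vehicle.getIDList()
        speeds = [traci.vehicle.getSpeed(v) for v in ids] or [0.0]
        av_speeds = [traci.vehicle.getSpeed(v) for v in ids if v.startswith("av")] or [0.0]
        halted = sum(1 for s in speeds if s < 0.1)
        return np.array([np.mean(speeds), np.mean(av_speeds), halted, len(ids)], dtype=np.float32)


if __name__ == "__main__":
    model = PPO("MlpPolicy", EcoDrivingEnv(), verbose=1)
    model.learn(total_timesteps=100_000)
```

Comparing average rewards at different penetration rates, as in the abstract, would amount to regenerating the route file with 0%, 5%, 10%, and 20% of vehicle IDs carrying the "av" prefix and rerunning the training above for each case.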