Deep reinforcement learning based optimization of automated guided vehicle time and energy consumption in a container terminal

Cited by: 36
Authors
Drungilas, Darius [1 ]
Kurmis, Mindaugas [1 ]
Senulis, Audrius [1 ]
Lukosius, Zydrunas [1 ]
Andziulis, Arunas [1 ]
Januteniene, Jolanta [1 ]
Bogdevicius, Marijonas [1 ]
Jankunas, Valdas [1 ]
Voznak, Miroslav [1 ,2 ]
Affiliations
[1] Klaipeda Univ, H Manto Str 84, LT-92294 Klaipeda, Lithuania
[2] VSB Tech Univ Ostrava, Dept Telecommun, 17 Listopadu 2172-15, Ostrava 70800, Czech Republic
Keywords
Automated guided vehicle (AGV); Container terminal; Energy consumption; Deep reinforcement learning; Modeling; Optimization; STRATEGIES; MANAGEMENT; EFFICIENCY; BATTERY; PORT;
DOI
10.1016/j.aej.2022.12.057
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
The energy efficiency of port container terminal equipment and the reduction of CO2 emissions are among the biggest challenges facing every seaport in the world. The article presents the modeling of the container transportation process in a terminal from the quay crane to the stack using a battery-powered Automated Guided Vehicle (AGV) to estimate the energy consumption parameters. An AGV speed control algorithm based on Deep Reinforcement Learning (DRL) is proposed to optimize the energy consumption of container transportation. The results, compared with real transportation measurements, showed that the proposed DRL-based approach, which dynamically changes the driving speed of the AGV, reduces energy consumption by 4.6%. The obtained results provide the prerequisites for further research to find optimal strategies for autonomous vehicle movement, including context awareness and information sharing with other vehicles in the terminal. (c) 2022 THE AUTHORS. Published by Elsevier BV on behalf of Faculty of Engineering, Alexandria University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Pages: 397-407 (11 pages)
References (36 records)
[1]   Renewable power source energy consumption by hybrid machine learning model [J].
Abd El-Aziz, Rasha M. .
ALEXANDRIA ENGINEERING JOURNAL, 2022, 61 (12) :9447-9455
[2]   Energy management in seaports: A new role for port authorities [J].
Acciaro, Michele ;
Ghiara, Hilda ;
Cusano, Maria Ines .
ENERGY POLICY, 2014, 71 :4-12
[3]   Microgrid energy management using deep Q-network reinforcement learning [J].
Alabdullah, Mohammed H. ;
Abido, Mohammad A. .
ALEXANDRIA ENGINEERING JOURNAL, 2022, 61 (11) :9069-9078
[5]   [Anonymous], 2022, Greenhouse gas emission intensity of electricity generation in Europe
[6]   Efficient routing for multi-AGV based on optimized Ant-agent [J].
Chen, Jinwen ;
Zhang, Xiaoli ;
Peng, Xiafu ;
Xu, Dongsheng ;
Peng, Jincheng .
COMPUTERS & INDUSTRIAL ENGINEERING, 2022, 167
[7]   Cong Wang, 2019, 2019 IEEE International Conference on Smart Manufacturing, Industrial & Logistics Engineering (SMILE). Proceedings, P195, DOI 10.1109/SMILE45626.2019.8965316
[8]   Electrical Motor Drivelines in Commercial All-Electric Vehicles: A Review [J].
de Santiago, Juan ;
Bernhoff, Hans ;
Ekergard, Boel ;
Eriksson, Sandra ;
Ferhatovic, Senad ;
Waters, Rafael ;
Leijon, Mats .
IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2012, 61 (02) :475-484
[9]   Minimizing Carbon Dioxide Emissions Due to Container Handling at Marine Container Terminals via Hybrid Evolutionary Algorithms [J].
Dulebenets, Maxim A. ;
Moses, Ren ;
Ozguven, Eren E. ;
Vanli, Arda .
IEEE ACCESS, 2017, 5 :8131-8147
[10]   Evaluation of the energy consumption of container diesel trucks in a container terminal: A case study at Klaipeda port [J].
Eglynas, Tomas ;
Jakovlev, Sergej ;
Jankunas, Valdas ;
Didziokas, Rimantas ;
Januteniene, Jolanta ;
Drungilas, Darius ;
Jusis, Mindaugas ;
Pocevicius, Edvinas ;
Bogdevicius, Marijonas ;
Andziulis, Arunas .
SCIENCE PROGRESS, 2021, 104 (03)