ARAGAN: A dRiver Attention estimation model based on conditional Generative Adversarial Network

Cited by: 4
Authors
Araluce, Javier [1 ]
Bergasa, Luis M. [1 ]
Ocaña, Manuel [1 ]
Barea, Rafael [1 ]
López-Guillén, Elena [1 ]
Revenga, Pedro [1 ]
Affiliations
[1] Electronics Department, University of Alcalá (UAH), Alcalá de Henares, Spain
Source
2022 IEEE Intelligent Vehicles Symposium (IV), 2022
DOI
10.1109/IV51971.2022.9827175
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject classification code
0812
Abstract
Predicting the driver's attention in complex driving scenarios is becoming a hot topic because it supports the design of several autonomous driving tasks, optimizing visual scene understanding and contributing knowledge to decision making. We introduce ARAGAN, a driver attention estimation model based on a conditional Generative Adversarial Network (cGAN). The architecture fuses adversarial learning with Multi-Head Attention mechanisms; to the best of our knowledge, this combination has never before been applied to driver attention prediction. The adversarial mechanism learns to map an RGB traffic image to an attention image while also learning the loss function. The attention mechanism finds the most informative feature maps inside the tensors of the network; in this work, we adapt this concept to locate the saliency areas of a driving scene. An ablation study over different architectures has been carried out, with results reported in terms of several saliency metrics. In addition, a comparison with other state-of-the-art models shows that our proposal outperforms them in accuracy and runtime, making it suitable for real-time applications. ARAGAN has been trained on BDDA and tested on BDDA and DADA2000, two of the most complex driver attention datasets available for research.
Pages: 1066-1072
Number of pages: 7
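
For illustration only, a minimal PyTorch sketch of the idea the abstract describes: a generator that encodes an RGB traffic image, applies Multi-Head Attention over its spatial features, and decodes a one-channel attention map, paired with a conditional discriminator that judges (image, map) pairs. None of this code comes from the paper; all module names, layer sizes, and the patch-style discriminator are assumptions.

# Illustrative sketch, not the authors' implementation of ARAGAN.
import torch
import torch.nn as nn

class AttentionGenerator(nn.Module):
    def __init__(self, channels=64, heads=4):
        super().__init__()
        # Downsampling encoder: RGB image -> spatial feature tensor
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Multi-Head Attention over flattened spatial positions, meant to
        # emphasize the most informative feature locations
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        # Upsampling decoder: feature tensor -> 1-channel attention map
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(channels, channels, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(channels, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        f = self.encoder(x)                    # (B, C, H/4, W/4)
        b, c, h, w = f.shape
        tokens = f.flatten(2).transpose(1, 2)  # (B, H*W/16, C)
        attended, _ = self.attn(tokens, tokens, tokens)
        f = attended.transpose(1, 2).reshape(b, c, h, w)
        return self.decoder(f)                 # (B, 1, H, W)

class PatchDiscriminator(nn.Module):
    """Scores (RGB image, attention map) pairs, as in a conditional GAN."""
    def __init__(self, channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + 1, channels, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, channels, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, 1, 4, padding=1),  # patch-level real/fake logits
        )

    def forward(self, image, attn_map):
        return self.net(torch.cat([image, attn_map], dim=1))

# Shape check on a dummy batch
if __name__ == "__main__":
    g, d = AttentionGenerator(), PatchDiscriminator()
    rgb = torch.randn(2, 3, 64, 64)
    pred = g(rgb)          # (2, 1, 64, 64) predicted attention map
    logits = d(rgb, pred)  # patch logits for the adversarial loss
    print(pred.shape, logits.shape)

In a full adversarial setup of this kind, the generator would be trained against the discriminator's patch logits, typically combined with a pixel-wise loss against ground-truth gaze maps; this is the standard cGAN recipe, which the paper's "learning the loss function" phrasing refers to.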