The effects of augmented reality head-up displays on drivers' eye scan patterns, performance, and perceptions

Cited by: 24
Authors
Smith M. [1 ]
Gabbard J.L. [2 ]
Burnett G. [3 ]
Doutcheva N. [1 ]
Affiliations
[1] Virginia Tech, Blacksburg, VA
[2] Grado Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg
[3] Human Factors Research Group, Faculty of Engineering, University of Nottingham, Nottingham
Funding
National Science Foundation (US)
Keywords
Glance Patterns; Head-Down Display; Head-Up Display; Vehicle Display System; Visual Display Image
DOI
10.4018/IJMHCI.2017040101
Abstract
This paper reports on an experiment comparing Head-Up Display (HUD) and Head-Down Display (HDD) use while driving in a simulator to explore differences in glance patterns, driving performance, and user preferences. Sixteen participants completed both structured (text) and semi-structured (grid) visual search tasks on each display while following a lead vehicle in a motorway (highway) environment. Participants experienced three levels of complexity (low, medium, high) for each visual search task, with five repetitions of each level. Results suggest that the grid task was not sensitive enough to the varying visual demands, while the text task showed significant differences between displays in user preference, perceived workload, and distraction. As complexity increased, HUD use during the text task corresponded with faster performance than the HDD, indicating potential benefits of HUD use in the driving context. Furthermore, HUD use was associated with longer sustained glances (at the respective display) than HDD use, with no differences in driving performance observed. This finding suggests that augmented reality (AR) HUDs afford longer glances without negatively affecting the longitudinal and lateral control of the vehicle, a result that has implications for how future researchers should evaluate the visual demands of AR HUDs. © 2017, IGI Global.
Pages: 1-17 (16 pages)