ASSESSMENT OF SITUATION AWARENESS FOR SEAFARERS USING EYE-TRACKING DATA

Cited: 0
Authors
Virdi, S. S. [1 ]
Ng, Yong Thiang [2 ]
Liu, Yisi [2 ]
Tan, Kelvin [2 ]
Zhang, Daniel [2 ]
Affiliations
[1] SMA Singapore Polytech, Singapore, Singapore
[2] CEMS Singapore Polytech, Singapore, Singapore
Keywords
Situation Awareness; Competence Assessment; Human Factors Study; Eye Tracking; Navigation Simulator; COLLISIONS; LESSONS;
DOI
Not available
Chinese Library Classification (CLC)
P75 [Ocean Engineering];
Discipline Classification Codes
0814 ; 081505 ; 0824 ; 082401 ;
Abstract
Situation Awareness (SA) is the perception of the current situation, the comprehension of its meaning, and the projection of what will happen in the near future. Navigators on the bridge must maintain high SA to mitigate the risk of human error and to improve navigational safety. However, current methods of assessing SA rely mainly on human experts, which introduces potential problems such as bias and work overload, and makes it difficult to capture every fine detail of the behaviour of the seafarers being assessed. An objective, automated way to assess SA is therefore needed. In this work, eye-tracking data are used to assess SA. The eye tracker localizes where the navigator is looking, and a computer-vision deep-learning algorithm identifies the activity the navigator is currently performing. In total, seven activities (using the RADAR, using the ECDIS, checking the ship's heading, checking the speed, checking data on the Echo Sounder, checking data related to the ship's manoeuvring, and others) can be recognized and are used as indicators of SA. Training data were recorded with Tobii Pro Glasses 3 to train the deep-learning algorithm and to evaluate its classification accuracy. To further verify the proposed eye-tracking-based assessment, a preliminary experiment was designed and carried out with five recruited subjects. A full-mission Advanced Navigation Research Simulator (ANRS) provided the scenarios for both the training-data collection and the preliminary experiment. Initial results show that a recognition accuracy above 99% can be achieved, which supports the eye-tracking-based recognition. The analytics of the preliminary-experiment data also show strong potential for using eye tracking to assess navigators' SA. The proposed assessment could be applied both in simulators and on board, for purposes such as performance evaluation, promotion to the next rank, and Continuing Professional Development.
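The abstract does not detail the deep-learning model used for activity recognition. Purely as an illustration, the sketch below (Python with PyTorch) shows one way such a pipeline could look: a scene-camera frame from the eye tracker is cropped around the reported gaze point, and a small convolutional network classifies the crop into the seven activities listed above. The class names, crop size, network architecture, and the helper crop_around_gaze are assumptions made for illustration, not the authors' implementation.

# Minimal sketch (not the authors' implementation): classify the navigator's
# ongoing activity from a scene-camera crop centred on the gaze point.
# The 7 activity labels follow the abstract; architecture and crop size are assumed.

import torch
import torch.nn as nn

ACTIVITIES = [
    "RADAR", "ECDIS", "heading", "speed",
    "echo_sounder", "manoeuvring_data", "other",
]

class GazeActivityNet(nn.Module):
    """Small CNN mapping a gaze-centred image crop to one of 7 activities."""

    def __init__(self, num_classes: int = len(ACTIVITIES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, crop: torch.Tensor) -> torch.Tensor:
        # crop: (batch, 3, H, W) scene-camera patch around the gaze point
        x = self.features(crop).flatten(1)
        return self.classifier(x)

def crop_around_gaze(frame: torch.Tensor, gaze_xy: tuple, size: int = 128) -> torch.Tensor:
    """Extract a size x size patch centred on the (x, y) gaze point, clamped to the frame."""
    _, h, w = frame.shape
    x, y = gaze_xy
    half = size // 2
    x0 = max(0, min(x - half, w - size))
    y0 = max(0, min(y - half, h - size))
    return frame[:, y0:y0 + size, x0:x0 + size]

if __name__ == "__main__":
    model = GazeActivityNet()
    frame = torch.rand(3, 1080, 1920)            # one scene-camera frame
    patch = crop_around_gaze(frame, (960, 540))  # gaze point from the eye tracker
    logits = model(patch.unsqueeze(0))
    print(ACTIVITIES[logits.argmax(dim=1).item()])

In practice, such a network would be trained on the Tobii Pro Glasses 3 recordings described above before its predicted activities are used as indicators of SA.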
Pages: 7