An interactive eye-tracking system for measuring radiologists' visual fixations in volumetric CT images: Implementation and initial eye-tracking accuracy validation

Cited: 8
Authors
Gong, Hao [1 ]
Hsieh, Scott S. [1 ]
Holmes, David R., III
Cook, David A.
Inoue, Akitoshi [1 ]
Bartlett, David J. [1 ,2 ,3 ]
Baffour, Francis [1 ]
Takahashi, Hiroaki [1 ]
Leng, Shuai [1 ]
Yu, Lifeng [1 ]
McCollough, Cynthia H. [1 ]
Fletcher, Joel G. [1 ]
Affiliations
[1] Mayo Clin, Dept Radiol, Rochester, MN 55901 USA
[2] Mayo Clin, Dept Physiol & Biomed Engn, Rochester, MN USA
[3] Mayo Clin, Dept Internal Med, Rochester, MN USA
Funding
US National Institutes of Health
Keywords
biofeedback; computed tomography; eye tracking; observer performance; expertise; gaze
DOI
10.1002/mp.15219
CLC numbers
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Discipline codes
1002; 100207; 1009
Abstract
Purpose: Eye-tracking approaches have been used to understand the visual search process in radiology. However, previous eye-tracking work in computed tomography (CT) has been limited largely to single cross-sectional images or video playback of the reconstructed volume, which do not accurately reflect radiologists' visual search activities or their interactivity with three-dimensional image data at a computer workstation (e.g., scroll, pan, and zoom) during visual evaluation of diagnostic imaging targets. We have developed a platform that integrates eye-tracking hardware with in-house-developed reader workstation software to allow monitoring of the visual search process and reader-image interactions in clinically relevant reader tasks. The purpose of this work is to validate the spatial accuracy of eye-tracking data acquired with this platform under different eye-tracking data acquisition modes.

Methods: An eye-tracker was integrated with a previously developed workstation designed for reader performance studies. The integrated system captured real-time eye movement and workstation events at a 1000 Hz sampling frequency. The eye-tracker was operated either in head-stabilized mode or in free-movement mode. In head-stabilized mode, the reader positioned their head on a manufacturer-provided chinrest. In free-movement mode, a biofeedback tool emitted an audio cue when the head position was outside the data collection range (general biofeedback) or outside a narrower range of positions near the calibration position (strict biofeedback). Four radiologists and one resident participated in three studies to determine eye-tracking spatial accuracy under three constraint conditions: head-stabilized mode (i.e., with use of a chinrest), free movement with general biofeedback, and free movement with strict biofeedback. Study 1 evaluated the impact of head stabilization versus general or strict biofeedback using a crosshair target, prior to the integration of the eye-tracker with the image viewing workstation. In Study 2, after integration of the eye-tracker and reader workstation, readers were asked to fixate on targets randomly distributed within a volumetric digital phantom. In Study 3, readers used the integrated system to scroll through volumetric patient CT angiographic images while fixating on the centerline of designated blood vessels (from the left coronary artery to the dorsalis pedis artery). Spatial accuracy was quantified as the offset between the center of the intended target and the detected fixation, in units of image pixels and degrees of visual angle.

Results: The three head position constraint conditions yielded comparable accuracy in the studies using digital phantoms. For Study 1, involving the digital crosshairs, the median ± standard deviation of offset values among readers was 15.2 ± 7.0 image pixels with the chinrest, 14.2 ± 3.6 image pixels with strict biofeedback, and 19.1 ± 6.5 image pixels with general biofeedback. For Study 2, using the random-dot phantom, the median ± standard deviation offset values were 16.7 ± 28.8 pixels with use of a chinrest, 16.5 ± 24.6 pixels using strict biofeedback, and 18.0 ± 22.4 pixels using general biofeedback, which translated to a visual angle of about 0.8 degrees for all three conditions. We found no obvious association between eye-tracking accuracy and target size or view time.
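For reference, the conversion between a pixel offset and the reported visual angle depends on the effective size of one displayed image pixel and on the viewing distance, neither of which is given in this record. Below is a minimal Python sketch of that conversion; both geometry values are hypothetical and were chosen only so that the roughly 17-pixel Study 2 offset lands near the reported 0.8 degrees.

```python
import math

def offset_to_visual_angle(offset_px: float,
                           pixel_pitch_mm: float,
                           viewing_distance_mm: float) -> float:
    """Convert a fixation-to-target offset from image pixels to a visual
    angle in degrees via theta = 2 * atan(s / (2 * D))."""
    s = offset_px * pixel_pitch_mm  # offset on the display, in mm
    theta = 2.0 * math.atan(s / (2.0 * viewing_distance_mm))
    return math.degrees(theta)

# Hypothetical geometry: an effective 0.54 mm per displayed image pixel
# (display pixel pitch times any zoom factor) at a 650 mm viewing distance
# puts the ~17-pixel Study 2 offset near the reported 0.8 degrees.
print(round(offset_to_visual_angle(16.7, 0.54, 650.0), 2))  # -> 0.79
```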
In Study 3, viewing patient images, use of the chinrest and strict biofeedback demonstrated comparable accuracy, while general biofeedback demonstrated slightly worse accuracy. The median ± standard deviation of offset values was 14.8 ± 11.4 pixels with use of a chinrest, 21.0 ± 16.2 pixels using strict biofeedback, and 29.7 ± 20.9 image pixels using general biofeedback, corresponding to visual angles ranging from 0.7 degrees to 1.3 degrees.

Conclusions: An integrated eye-tracker system to assess reader eye movement and interactive viewing in relation to imaging targets demonstrated reasonable spatial accuracy for assessment of visual fixation. The head-free movement condition with audio biofeedback performed similarly to the head-stabilized mode.
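The abstract describes the biofeedback only behaviorally: an audio cue when the head leaves an allowed range, with strict biofeedback using a narrower range than general biofeedback. As an illustration, here is a minimal sketch of that check in Python; the tolerance radii and the (x, y, z) head-position representation are assumptions, not the authors' implementation.

```python
import math

# Hypothetical tolerance radii; the record gives no numeric ranges, only
# that strict biofeedback uses a narrower range than general biofeedback.
GENERAL_TOLERANCE_MM = 30.0
STRICT_TOLERANCE_MM = 10.0

def should_emit_audio_cue(head_pos_mm: tuple[float, float, float],
                          calib_pos_mm: tuple[float, float, float],
                          strict: bool) -> bool:
    """Return True when the head has drifted outside the allowed range
    around the calibration position, triggering the audio cue."""
    tolerance = STRICT_TOLERANCE_MM if strict else GENERAL_TOLERANCE_MM
    return math.dist(head_pos_mm, calib_pos_mm) > tolerance

# Example: 25 mm of drift trips the strict cue but not the general one.
print(should_emit_audio_cue((25.0, 0.0, 0.0), (0.0, 0.0, 0.0), strict=True))   # True
print(should_emit_audio_cue((25.0, 0.0, 0.0), (0.0, 0.0, 0.0), strict=False))  # False
```

Here the general tolerance stands in for the eye-tracker's data collection range described in the Methods, and the strict tolerance for the narrower band around the calibration position.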
Pages: 6710-6723
Page count: 14