Visual exploration of omnidirectional panoramic scenes

Cited: 6
Authors
Bischof, Walter F. [1 ]
Anderson, Nicola C. [1 ]
Doswell, Michael T. [1 ]
Kingstone, Alan [1 ]
Affiliations
[1] Univ British Columbia, Vancouver, BC, Canada
Source
JOURNAL OF VISION | 2020, Vol. 20, No. 7
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
eye movements; head movements; gaze-head relationship; omnidirectional panoramic scenes; EYE-HEAD COORDINATION; GAZE ALLOCATION; MOVEMENTS; FIXATION; SALIENCY; WORLD; PERCEPTION; DIRECTION; PATTERNS; SEARCH;
DOI
10.1167/jov.20.7.23
CLC Classification
R77 [Ophthalmology]
Discipline Code
100212
Abstract
How do we explore the visual environment around us, and how are head and eye movements coordinated during our exploration? To investigate this question, we had observers look at omnidirectional panoramic scenes, composed of both landscape and fractal images, using a virtual reality viewer while their eye and head movements were tracked. We analyzed the spatial distributions of eye fixations and head positions, the distributions of saccade directions and head shifts, and the relation between eye and head movements. The results show that, for landscape scenes, eye and head behavior best fit the allocentric frame defined by the scene horizon, especially when head tilt (i.e., head rotation around the view axis) is considered. For fractal scenes, which have an isotropic texture, eye and head movements were executed primarily along the cardinal directions in world coordinates. The results also show that eye and head movements are closely linked in space and time in a complementary way, with stimulus-driven eye movements predominantly leading head movements. Our study is the first to systematically examine eye and head movements in a panoramic virtual reality environment, and the results demonstrate that a virtual reality environment constitutes a powerful and informative alternative to traditional methods for investigating looking behavior.
Pages: 29
Related Papers (50 total)
  • [21] Visual exploration and action processing in schizophrenia
    Delerue, Celine
    Boucart, Muriel
    COGNITIVE NEUROPSYCHIATRY, 2013, 18 (03) : 153 - 167
  • [22] Impact of dynamic bottom-up features and top-down control on the visual exploration of moving real-world scenes in hemispatial neglect
    Machner, Bjoern
    Dorr, Michael
    Sprenger, Andreas
    von der Gablentz, Janina
    Heide, Wolfgang
    Barth, Erhardt
    Helmchen, Christoph
    NEUROPSYCHOLOGIA, 2012, 50 (10) : 2415 - 2425
  • [23] Consequences of Beauty: Effects of Rater Sex and Sexual Orientation on the Visual Exploration and Evaluation of Attractiveness in Real World Scenes
    Mitrovic, Aleksandra
Tinio, Pablo P. L.
    Leder, Helmut
    FRONTIERS IN HUMAN NEUROSCIENCE, 2016, 10
  • [24] The attraction of visual attention to texts in real-world scenes
    Wang, Hsueh-Cheng
    Pomplun, Marc
JOURNAL OF VISION, 2012, 12 (06)
  • [25] Impact of Simulated Central Scotomas on Visual Search in Natural Scenes
    McIlreavy, Lee
    Fiser, Jozsef
    Bex, Peter J.
    OPTOMETRY AND VISION SCIENCE, 2012, 89 (09) : 1385 - 1394
  • [26] Visual conspicuity of objects in complex scenes
    Boersema, T
    Zwaga, HJG
    HUMAN VISION AND ELECTRONIC IMAGING V, 2000, 3959 : 548 - 558
  • [27] The visual light field in real scenes
    Xia, Ling
    Pont, Sylvia C.
    Heynderickx, Ingrid
I-PERCEPTION, 2014, 5 (07): 613 - 629
  • [28] Rehabilitation of reading and visual exploration in visual field disorders: transfer or specificity?
    Schuett, Susanne
    Heywood, Charles A.
    Kentridge, Robert W.
    Dauner, Ruth
    Zihl, Josef
    BRAIN, 2012, 135 : 912 - 921
  • [29] Processing of unattended emotional visual scenes
    Calvo, Manuel G.
    Nummenmaa, Lauri
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-GENERAL, 2007, 136 (03) : 347 - 369
  • [30] Flipping the world upside down: Using eye tracking in virtual reality to study visual search in inverted scenes
    Beitner, Julia
    Helbing, Jason
    Draschkow, Dejan
    David, Erwan J.
Vo, Melissa L.-H.
JOURNAL OF EYE MOVEMENT RESEARCH, 2022, 15 (03)