Language-driven anticipatory eye movements in virtual reality

Cited by: 17
Authors
Eichert, Nicole [1 ,2 ]
Peeters, David [1 ]
Hagoort, Peter [1 ,3 ]
Affiliations
[1] Max Planck Inst Psycholinguist, Nijmegen, Netherlands
[2] Univ Oxford, Oxford, England
[3] Radboud Univ Nijmegen, Donders Inst Brain Cognit & Behav, Nijmegen, Netherlands
Keywords
Virtual Reality; Prediction; Language Comprehension; Eyetracking; Visual World; Spoken Language; Time Course; Fixation; Information; Speech; Integration; Pictures; Models
DOI
10.3758/s13428-017-0929-z
Chinese Library Classification
B841 [Psychological research methods]
Discipline code
040201
Abstract
Predictive language processing is often studied by measuring eye movements as participants look at objects on a computer screen while they listen to spoken sentences. This variant of the visual-world paradigm has revealed that information encountered by a listener at a spoken verb can give rise to anticipatory eye movements to a target object, which is taken to indicate that people predict upcoming words. The ecological validity of such findings remains questionable, however, because these computer experiments used two-dimensional stimuli that were mere abstractions of real-world objects. Here we present a visual-world paradigm study in a three-dimensional (3-D) immersive virtual reality environment. Despite significant changes in the stimulus materials and the different mode of stimulus presentation, language-mediated anticipatory eye movements were still observed. These findings thus indicate that people do predict upcoming words during language comprehension in a more naturalistic setting where natural depth cues are preserved. Moreover, the results confirm the feasibility of using eyetracking in rich and multimodal 3-D virtual environments.
Pages: 1102-1115
Page count: 14