Language-driven anticipatory eye movements in virtual reality

Cited by: 17
Authors
Eichert, Nicole [1 ,2 ]
Peeters, David [1 ]
Hagoort, Peter [1 ,3 ]
Affiliations
[1] Max Planck Inst Psycholinguist, Nijmegen, Netherlands
[2] Univ Oxford, Oxford, England
[3] Radboud Univ Nijmegen, Donders Inst Brain Cognit & Behav, Nijmegen, Netherlands
Keywords
Virtual Reality; Prediction; Language Comprehension; Eyetracking; Visual World; SPOKEN LANGUAGE; TIME-COURSE; PREDICTION; FIXATION; INFORMATION; SPEECH; WORLD; INTEGRATION; PICTURES; MODELS;
DOI
10.3758/s13428-017-0929-z
CLC Classification
B841 [Psychological research methods];
Subject Classification
040201;
Abstract
Predictive language processing is often studied by measuring eye movements as participants look at objects on a computer screen while they listen to spoken sentences. This variant of the visual-world paradigm has revealed that information encountered by a listener at a spoken verb can give rise to anticipatory eye movements to a target object, which is taken to indicate that people predict upcoming words. The ecological validity of such findings remains questionable, however, because these computer experiments used two-dimensional stimuli that were mere abstractions of real-world objects. Here we present a visual-world paradigm study in a three-dimensional (3-D) immersive virtual reality environment. Despite significant changes in the stimulus materials and the different mode of stimulus presentation, language-mediated anticipatory eye movements were still observed. These findings thus indicate that people do predict upcoming words during language comprehension in a more naturalistic setting where natural depth cues are preserved. Moreover, the results confirm the feasibility of using eyetracking in rich and multimodal 3-D virtual environments.
Pages: 1102-1115
Page count: 14
Related Papers
50 records total (first 10 shown)
  • [1] Language-driven anticipatory eye movements in virtual reality
    Nicole Eichert
    David Peeters
    Peter Hagoort
    Behavior Research Methods, 2018, 50 : 1102 - 1115
  • [2] Effect of repetition proportion on language-driven anticipatory eye movements
    Britt, Allison E.
    Mirman, Daniel
    Kornilov, Sergey A.
    Magnuson, James S.
    ACTA PSYCHOLOGICA, 2014, 145 : 128 - 138
  • [3] Linguistically guided anticipatory eye movements in scene viewing
    Staub, Adrian
    Abbott, Matthew
    Bogartz, Richard S.
    VISUAL COGNITION, 2012, 20 (08) : 922 - 946
  • [4] Modulation of scene consistency and task demand on language-driven eye movements for audio-visual integration
    Yu, Wan-Yun
    Tsai, Jie-Li
    ACTA PSYCHOLOGICA, 2016, 171 : 1 - 16
  • [5] Vergence eye movements in virtual reality
    McAnally, Ken
    Grove, Philip
    Wallis, Guy
    DISPLAYS, 2024, 83
  • [6] Visual context constrains language-mediated anticipatory eye movements
    Hintz, Florian
    Meyer, Antje S.
    Huettig, Falk
    QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 2020, 73 (03) : 458 - 467
  • [7] Prediction of upcoming speech under fluent and disfluent conditions: eye tracking evidence from immersive virtual reality
    Huizeling, Eleanor
    Peeters, David
    Hagoort, Peter
    LANGUAGE COGNITION AND NEUROSCIENCE, 2022, 37 (04) : 481 - 508
  • [8] Verbal Semantics Drives Early Anticipatory Eye Movements during the Comprehension of Verb-Initial Sentences
    Sauppe, Sebastian
    FRONTIERS IN PSYCHOLOGY, 2016, 7
  • [9] Reinforcement effects in anticipatory smooth eye movements
    Damasse, Jean-Bernard
    Perrinet, Laurent U.
    Madelain, Laurent
    Montagnini, Anna
    JOURNAL OF VISION, 2018, 18 (11): : 1 - 18
  • [10] Effects of speech rate on anticipatory eye movements in the visual world paradigm: Evidence from aging, native, and non-native language processing
    Fernandez, Leigh B.
    Engelhardt, Paul E.
    Patarroyo, Angela G.
    Allen, Shanley E. M.
    QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 2020, 73 (12) : 2348 - 2361