A link between individual differences in multisensory speech perception and eye movements

Cited by: 0
Authors
Demet Gurler
Nathan Doyle
Edgar Walker
John Magnotti
Michael Beauchamp
Affiliations
[1] University of Alabama at Birmingham, Department of Radiology
[2] University of Texas Health Science Center at Houston, Department of Neurobiology and Anatomy
[3] Baylor College of Medicine, Department of Neuroscience
[4] Baylor College of Medicine, Department of Neurosurgery
Source
Attention, Perception, & Psychophysics | 2015, Vol. 77
Keywords
Eye movements; Cognitive; Speech perception; Multisensory processing
DOI: not available
Abstract
The McGurk effect is an illusion in which visual speech information dramatically alters the perception of auditory speech. However, there is a high degree of individual variability in how frequently the illusion is perceived: some individuals almost always perceive the McGurk effect, while others rarely do. Another axis of individual variability is the pattern of eye movements individuals make while viewing a talking face: some individuals often fixate the mouth of the talker, while others rarely do. Since the talker's mouth carries the visual speech information necessary to induce the McGurk effect, we hypothesized that individuals who frequently perceive the McGurk effect should spend more time fixating the talker's mouth. We used infrared eye tracking to study eye movements as 40 participants viewed audiovisual speech. Frequent perceivers of the McGurk effect were more likely to fixate the mouth of the talker, and there was a significant correlation between McGurk frequency and mouth looking time. The noisy encoding of disparity model of McGurk perception showed that individuals who frequently fixated the mouth had lower sensory noise and higher disparity thresholds than those who rarely fixated the mouth. Differences in eye movements when viewing the talker's face may be an important contributor to interindividual differences in multisensory speech perception.
Pages: 1333-1341 (8 pages)