High visual resolution matters in audiovisual speech perception, but only for some

Cited by: 18
Authors
Alsius, Agnes [1 ]
Wayne, Rachel V. [1 ]
Pare, Martin [2 ]
Munhall, Kevin G. [1 ,2 ]
Affiliations
[1] Queens Univ, Dept Psychol, Humphrey Hall, 62 Arch St, Kingston, ON K7L 3N6, Canada
[2] Queens Univ, Ctr Neurosci Studies, Kingston, ON, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Speech perception; Audiovisual integration; Speechreading skill; Spatial frequency; Eye gaze; WORD-RECOGNITION; HEARING; INTEGRATION; NOISE; DISTANCE; INTELLIGIBILITY; DISTINCTIVENESS; COMPREHENSION; PERFORMANCE; RECEPTION;
DOI
10.3758/s13414-016-1109-4
Chinese Library Classification
B84 [Psychology];
Discipline Code
04; 0402;
Abstract
The basis for individual differences in the degree to which visual speech input enhances comprehension of acoustically degraded speech is largely unknown. Previous research indicates that fine facial detail is not critical for visual enhancement when auditory information is available; however, these studies did not examine individual differences in ability to make use of fine facial detail in relation to audiovisual speech perception ability. Here, we compare participants based on their ability to benefit from visual speech information in the presence of an auditory signal degraded with noise, modulating the resolution of the visual signal through low-pass spatial frequency filtering and monitoring gaze behavior. Participants who benefited most from the addition of visual information (high visual gain) were more adversely affected by the removal of high spatial frequency information, compared to participants with low visual gain, for materials with both poor and rich contextual cues (i.e., words and sentences, respectively). Differences as a function of gaze behavior between participants with the highest and lowest visual gains were observed only for words, with participants with the highest visual gain fixating longer on the mouth region. Our results indicate that the individual variance in audiovisual speech-in-noise performance can be accounted for, in part, by better use of fine facial detail information extracted from the visual signal and increased fixation on mouth regions for short stimuli. Thus, for some, audiovisual speech perception may suffer when the visual input (in addition to the auditory signal) is less than perfect.
Pages: 1472-1487
Number of pages: 16
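
The visual-resolution manipulation described in the abstract relies on low-pass spatial frequency filtering of the talker's face. As an illustrative aside rather than the authors' actual stimulus pipeline, the minimal Python sketch below shows one common way to apply such a filter to a single grayscale frame, assuming NumPy and a hard circular cutoff expressed in cycles per image; the function name, the cutoff parameterization, and the example value of 8 cycles are hypothetical choices for illustration, and the study's own filter shape and cutoff values may differ.

    import numpy as np

    def lowpass_spatial_frequency(frame, cutoff_cycles):
        # frame: 2-D grayscale image as a NumPy array
        # cutoff_cycles: highest spatial frequency retained, in cycles per image
        h, w = frame.shape
        fy = np.fft.fftfreq(h) * h   # vertical frequencies, in cycles per image
        fx = np.fft.fftfreq(w) * w   # horizontal frequencies, in cycles per image
        radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
        mask = radius <= cutoff_cycles            # hard circular low-pass mask
        spectrum = np.fft.fft2(frame)             # to the 2-D frequency domain
        return np.fft.ifft2(spectrum * mask).real # back to the image domain

    # Example: retain only the lowest 8 cycles per image of a synthetic 256 x 256 frame
    frame = np.random.rand(256, 256)
    blurred = lowpass_spatial_frequency(frame, cutoff_cycles=8.0)

A Gaussian low-pass filter (e.g., scipy.ndimage.gaussian_filter) is a softer alternative that avoids the ringing artifacts a hard frequency cutoff can introduce.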