Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli

Cited by: 0
Authors
Shoko Kanaya
Kazuhiko Yokosawa
Institution
The University of Tokyo, Department of Psychology, Graduate School of Humanities and Sociology
Source
Psychonomic Bulletin & Review | 2011 / Volume 18
Keywords
Ventriloquism; Speech perception; Multisensory integration
DOI: not available
Abstract
Many studies of multisensory processing have focused on performance in simplified experimental situations, with a single stimulus in each sensory modality. However, these results cannot necessarily be applied to explain our perceptual behavior in natural scenes, where multiple signals exist within a single sensory modality. We investigated the role of audio-visual syllable congruency in participants' auditory localization bias, or the ventriloquism effect, using spoken utterances and two videos of a talking face. The salience of facial movements was also manipulated. Results indicated that more salient visual utterances attracted participants' auditory localization. Congruent pairings of audio-visual utterances elicited greater localization bias than incongruent pairings, whereas previous studies have reported that ventriloquism depends little on the realism of the stimuli. Moreover, audio-visual illusory congruency, owing to the McGurk effect, caused substantial visual interference with auditory localization. Multisensory performance appears more flexible and adaptive in this complex environment than in previous studies.
Pages: 123–128 (5 pages)