Audio-visual speech perception: a developmental ERP investigation

Cited by: 52
Authors
Knowland, Victoria C. P. [1 ,2 ]
Mercure, Evelyne [3 ]
Karmiloff-Smith, Annette [2 ]
Dick, Fred [2 ]
Thomas, Michael S. C. [2 ]
Affiliations
[1] City Univ London, Sch Hlth Sci, London EC1V 0HB, England
[2] Univ London Birkbeck Coll, Dept Psychol Sci, London WC1E 7HX, England
[3] UCL, Inst Cognit Neurosci, London, England
Funding
UK Medical Research Council
Keywords
SUPERIOR TEMPORAL SULCUS; AUDITORY-EVOKED-POTENTIALS; EVENT-RELATED POTENTIALS; VISUAL SPEECH; MULTISENSORY INTEGRATION; TALKING FACES; HUMAN BRAIN; INFORMATION; ACTIVATION; MATURATION;
DOI
10.1111/desc.12098
Chinese Library Classification
B844 [Developmental psychology (human psychology)]
Discipline code
040202
Abstract
Being able to see a talking face confers a considerable advantage for speech perception in adulthood. However, behavioural data currently suggest that children fail to make full use of available visual speech cues until age 8 or 9. This is particularly surprising given the potential utility of multiple informational cues during language learning. We therefore explored this at the neural level. The event-related potential (ERP) technique has been used to assess the mechanisms of audio-visual speech perception in adults, with visual cues reliably modulating auditory ERP responses to speech. Previous work has shown congruence-dependent shortening of auditory N1/P2 latency and congruence-independent attenuation of amplitude in the presence of auditory and visual speech signals, compared to auditory alone. The aim of this study was to chart the development of these well-established modulatory effects over mid-to-late childhood. Experiment 1 employed an adult sample to validate a child-friendly stimulus set and paradigm by replicating previously observed effects of N1/P2 amplitude and latency modulation by visual speech cues; it also revealed greater attenuation of component amplitude given incongruent audio-visual stimuli, pointing to a new interpretation of the amplitude modulation effect. Experiment 2 used the same paradigm to map cross-sectional developmental change in these ERP responses between 6 and 11 years of age. The effect of amplitude modulation by visual cues emerged over development, while the effect of latency modulation was stable over the child sample. These data suggest that auditory ERP modulation by visual speech represents separable underlying cognitive processes, some of which show earlier maturation than others over the course of development.
Pages: 110-124
Page count: 15