Hand gestures as visual prosody: BOLD responses to audio-visual alignment are modulated by the communicative nature of the stimuli

Cited by: 24
Authors
Biau, Emmanuel [1 ]
Moris Fernandez, Luis [1 ]
Holle, Henning [3 ]
Avila, Cesar [4 ]
Soto-Faraco, Salvador [1 ,2 ]
Affiliations
[1] Univ Pompeu Fabra, Multisensory Res Grp, Ctr Brain & Cognit, Barcelona 08018, Spain
[2] ICREA, Barcelona, Spain
[3] Univ Hull, Dept Psychol, Kingston Upon Hull HU6 7RX, N Humberside, England
[4] Univ Jaume 1, Dept Psychol, Castellon de La Plana, Spain
Funding
European Research Council;
Keywords
Speech perception; Gestures; Audiovisual speech; Multisensory Integration; MTG; fMRI; SUPERIOR TEMPORAL SULCUS; CO-SPEECH GESTURES; BIOLOGICAL MOTION; BEAT GESTURES; NEURAL BASIS; BROCAS AREA; INTEGRATION; BRAIN; PERCEPTION; REGIONS;
DOI
10.1016/j.neuroimage.2016.02.018
CLC classification
Q189 [Neuroscience];
Subject classification code
071006;
Abstract
During public addresses, speakers accompany their discourse with spontaneous hand gestures (beats) that are tightly synchronized with the prosodic contour of the discourse. It has been proposed that speech and beat gestures originate from a common underlying linguistic process whereby both speech prosody and beats serve to emphasize relevant information. We hypothesized that breaking the consistency between beats and prosody through temporal desynchronization would modulate activity in brain areas sensitive to speech-gesture integration. To this end, we measured BOLD responses as participants watched a natural discourse in which the speaker used beat gestures. In order to identify brain areas specifically involved in processing hand gestures with communicative intention, beat synchrony was evaluated against arbitrary visual cues bearing rhythmic and spatial properties equivalent to those of the gestures. Our results revealed that the left MTG and IFG were specifically sensitive to speech synchronized with beats, compared to the arbitrary visual cue-speech pairing. These findings suggest that listeners confer on beats a function of visual prosody, complementary to the prosodic structure of speech. We conclude that the emphasizing function of beat gestures in speech perception is instantiated through a specialized brain network sensitive to the communicative intent conveyed by a speaker with his or her hands. (C) 2016 Elsevier Inc. All rights reserved.
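The abstract describes the study's core comparison: BOLD responses to speech synchronized with beat gestures versus speech paired with arbitrary visual cues of matched rhythm and spatial properties. Purely as an illustrative sketch of that kind of condition contrast, the snippet below estimates a "beats minus arbitrary cues" effect with a simple GLM on a simulated voxel time course. All names, onsets, durations, the HRF, and the noise level are hypothetical assumptions for demonstration; none of it is taken from the paper or its analysis pipeline.

```python
# Illustrative sketch only: a minimal GLM contrast between a hypothetical
# "speech + beat gesture" condition and a "speech + arbitrary visual cue"
# condition, fitted to a simulated BOLD time course. Not the authors' design.
import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 200                          # assumed repetition time (s) and run length
t = np.arange(n_scans) * TR

def hrf(times):
    """Simple double-gamma haemodynamic response function (canonical form)."""
    return gamma.pdf(times, 6) - 0.35 * gamma.pdf(times, 16)

def regressor(onsets, dur=4.0):
    """Boxcar for the given onsets, convolved with the HRF and sampled at the TR."""
    box = np.zeros(n_scans)
    for onset in onsets:
        box[(t >= onset) & (t < onset + dur)] = 1.0
    return np.convolve(box, hrf(np.arange(0, 32, TR)))[:n_scans]

beat_sync = regressor(np.arange(10, 380, 60))   # hypothetical beat-gesture blocks
cue_sync = regressor(np.arange(40, 380, 60))    # hypothetical arbitrary-cue blocks
X = np.column_stack([beat_sync, cue_sync, np.ones(n_scans)])  # design matrix + intercept

# Simulated voxel that responds more strongly to the beat-gesture condition.
rng = np.random.default_rng(0)
y = 1.5 * beat_sync + 0.5 * cue_sync + rng.normal(0, 1, n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least-squares GLM fit
contrast = np.array([1, -1, 0])                 # beats > arbitrary cues
print("contrast estimate (beat - cue):", contrast @ beta)
```

In a real fMRI analysis this contrast would be computed voxel-wise with a dedicated package (e.g. SPM, FSL, or nilearn) and thresholded across the brain; the sketch only mirrors the logic of comparing beat-synchronized speech against the arbitrary-cue control.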
Pages: 129-137
Page count: 9