Natural, Metaphoric, and Linguistic Auditory Direction Signals Have Distinct Influences on Visual Motion Processing

Cited by: 92
Authors
Sadaghiani, Sepideh [1 ]
Maier, Joost X. [1 ]
Noppeney, Uta [1 ]
Affiliations
[1] Max Planck Inst Biol Cybernet, D-72076 Tubingen, Germany
Keywords
CROSSMODAL SPATIAL ATTENTION; VENTRAL INTRAPARIETAL AREA; HUMAN PARIETAL CORTEX; HUMAN BRAIN; COGNITIVE CONTROL; MULTISENSORY INTERACTIONS; PSYCHOMETRIC FUNCTION; SENSORY MODALITIES; NEURAL MECHANISMS; LOW-LEVEL;
DOI
10.1523/JNEUROSCI.5437-08.2009
Chinese Library Classification
Q189 [Neuroscience]
Discipline code
071006
Abstract
To interact with our dynamic environment, the brain merges motion information from the auditory and visual senses. However, not only "natural" auditory MOTION, but also "metaphoric" de-/ascending PITCH and SPEECH (e.g., "left/right"), influence the visual motion percept. Here, we systematically investigate whether these three classes of direction signals influence visual motion perception through shared or distinct neural mechanisms. In a visual selective-attention paradigm, subjects discriminated the direction of visual motion at several levels of reliability while an irrelevant auditory stimulus was congruent, absent, or incongruent. Although the natural, metaphoric, and linguistic auditory signals were equally long and adjusted to induce a comparable directional bias on the motion percept, they influenced visual motion processing at different levels of the cortical hierarchy. Significant audiovisual interactions were revealed for MOTION in the left human motion complex (hMT+/V5+) and for SPEECH in the right intraparietal sulcus. In fact, the audiovisual interaction decreased gradually in left hMT+/V5+ for MOTION > PITCH > SPEECH and in right intraparietal sulcus for SPEECH > PITCH > MOTION. In conclusion, natural motion signals are integrated in audiovisual motion areas, whereas the influence of culturally learnt signals emerges primarily in higher-level convergence regions.
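For illustration only, the sketch below shows one plausible way to quantify the auditory-induced directional bias described in the abstract: fit a cumulative-Gaussian psychometric function (in the spirit of Wichmann & Hill, 2001) to the proportion of "rightward" responses across visual reliability (coherence) levels, separately for congruent and incongruent sound conditions, and read the bias off the shift of the point of subjective equality. The coherence levels and response proportions are hypothetical; this is not the authors' data or analysis pipeline.

```python
# Minimal sketch (hypothetical data, not the authors' pipeline): estimate the
# directional bias induced by a congruent vs. incongruent auditory cue by fitting
# cumulative-Gaussian psychometric functions to "rightward" response proportions.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(coherence, mu, sigma):
    """Cumulative Gaussian: P('rightward') as a function of signed motion coherence."""
    return norm.cdf(coherence, loc=mu, scale=sigma)

# Hypothetical signed coherence levels (negative = leftward) and response proportions.
coherence = np.array([-0.4, -0.2, -0.1, 0.0, 0.1, 0.2, 0.4])
p_right_congruent   = np.array([0.05, 0.15, 0.35, 0.60, 0.80, 0.92, 0.98])
p_right_incongruent = np.array([0.02, 0.08, 0.20, 0.40, 0.65, 0.85, 0.97])

(mu_c, sig_c), _ = curve_fit(psychometric, coherence, p_right_congruent,   p0=[0.0, 0.2])
(mu_i, sig_i), _ = curve_fit(psychometric, coherence, p_right_incongruent, p0=[0.0, 0.2])

# The shift of the point of subjective equality (PSE) indexes the auditory-induced bias.
print(f"PSE congruent: {mu_c:.3f}, PSE incongruent: {mu_i:.3f}, bias: {mu_i - mu_c:.3f}")
```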
Pages: 6490-6499
Page count: 10