Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources

Cited by: 17
Authors
Wahn, Basil [1]
Murali, Supriya [2]
Sinnett, Scott [3]
Koenig, Peter [4,5]
Affiliations
[1] Univ Osnabruck, Inst Cognit Sci, Neurobiopsychol Lab, Osnabruck, Germany
[2] Univ Osnabruck, Inst Cognit Sci, Wachsbleiche 27, D-49090 Osnabruck, Germany
[3] Univ Hawaii Manoa, Dept Psychol, Honolulu, HI 96822 USA
[4] Univ Osnabruck, Inst Cognit Sci, Neurobiopsychol, Osnabruck, Germany
[5] Univ Med Ctr Hamburg Eppendorf, Dept Neurophysiol & Pathophysiol, Hamburg, Germany
Funding
European Research Council;
Keywords
attentional resources; multisensory processing; vision; audition; multiple object tracking; multisensory integration; load theory; VISUAL DOMINANCE; INFORMATION; INTEGRATION; VISION; TOUCH; PERFORMANCE; BLINK;
DOI
10.1177/2041669516688026
CLC Number
B84 [Psychology];
Discipline Code
04; 0402;
Abstract
Humans' ability to detect relevant sensory information while engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent multisensory integration depends on attentional resources. We addressed these two questions using a dual-task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that the tasks interfered significantly. However, the interference was about 50% lower when the tasks were performed in separate sensory modalities rather than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, perceptual sensitivities were significantly higher for audiovisual stimuli than for unisensory stimuli, regardless of whether attentional resources were diverted to the multiple object tracking task. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits do not depend on attentional resources.
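The abstract mentions that detection-task stimulus intensities were adjusted with a staircase procedure. As an illustration only, the following is a minimal sketch of one common adaptive staircase, a 1-up/2-down rule (which converges on roughly 70.7% correct detection); the paper's actual staircase parameters are not given in the abstract, so the rule, step size, and bounds here are assumptions.

```python
def staircase(responses, start=1.0, step=0.1, floor=0.0, ceiling=1.0):
    """Return the stimulus intensity used on each trial under a
    1-up/2-down rule.

    responses: iterable of booleans (True = stimulus detected).
    Two consecutive detections lower the intensity by `step`
    (harder); any miss raises it by `step` (easier), clipped to
    [floor, ceiling].
    """
    intensity = start
    consecutive_hits = 0
    intensities = []
    for detected in responses:
        intensities.append(intensity)
        if detected:
            consecutive_hits += 1
            if consecutive_hits == 2:  # two hits in a row -> make harder
                intensity = max(floor, intensity - step)
                consecutive_hits = 0
        else:                          # any miss -> make easier
            intensity = min(ceiling, intensity + step)
            consecutive_hits = 0
    return intensities
```

Over many trials such a rule oscillates around the observer's detection threshold, which is the property the dual-task design relies on when comparing sensitivities across conditions.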
Pages: 18