Spatial working memory for locations specified by vision and audition: Testing the amodality hypothesis

Cited by: 25
Authors
Loomis, Jack M. [1 ]
Klatzky, Roberta L. [2 ]
McHugh, Brendan [1 ]
Giudice, Nicholas A. [3 ]
Affiliations
[1] Univ Calif Santa Barbara, Dept Psychol & Brain Sci, Santa Barbara, CA 93106 USA
[2] Carnegie Mellon Univ, Dept Psychol, Pittsburgh, PA 15213 USA
[3] Univ Maine, Spatial Informat Program, Sch Comp & Informat Sci, Orono, ME 04469 USA
Keywords
Working memory; Visual perception; Audition; Auditory distance perception; Directed action; 3-D sound; Language; Representations; Humans; Cortex; Integration; Activation; Knowledge
DOI
10.3758/s13414-012-0311-2
Chinese Library Classification
B84 [Psychology];
Subject Classification
04; 0402;
Abstract
Spatial working memory can maintain representations from vision, hearing, and touch, representations referred to here as spatial images. The present experiment addressed whether spatial images from vision and hearing that are simultaneously present within working memory retain modality-specific tags or are amodal. Observers were presented with short sequences of targets varying in angular direction, with the targets in a given sequence being all auditory, all visual, or a sequential mixture of the two. On two thirds of the trials, one of the locations was repeated, and observers had to respond as quickly as possible when detecting this repetition. Ancillary detection and localization tasks confirmed that the visual and auditory targets were perceptually comparable. Response latencies in the working memory task showed small but reliable costs in performance on trials involving a sequential mixture of auditory and visual targets, as compared with trials of pure vision or pure audition. These deficits were statistically reliable only for trials on which the modalities of the matching location switched from the penultimate to the final target in the sequence, indicating a switching cost. The switching cost for the pair in immediate succession means that the spatial images representing the target locations retain features of the visual or auditory representations from which they were derived. However, there was no reliable evidence of a performance cost for mixed modalities in the matching pair when the second of the two did not immediately follow the first, suggesting that more enduring spatial images in working memory may be amodal.
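The repetition-detection design summarized in the abstract can be made concrete with a small sketch. The Python snippet below is written for this record, not taken from the paper: it generates one trial as a short sequence of angular targets under a pure-visual, pure-auditory, or mixed condition, inserts a location repeat on roughly two thirds of trials, and flags whether the matching pair involves a modality switch in immediate succession (penultimate to final target). The sequence length, the set of directions, and all names are assumptions made for illustration only.

```python
import random

# Illustrative sketch only (not the authors' experimental code).
# All parameters below are hypothetical stand-ins for the paper's actual values.
MODALITY_CONDITIONS = ["visual", "auditory", "mixed"]
ANGLES_DEG = [-60, -40, -20, 0, 20, 40, 60]   # hypothetical target directions
SEQ_LEN = 4                                    # hypothetical sequence length
P_REPEAT = 2 / 3                               # abstract: repeat on two thirds of trials

def make_trial(condition):
    """Return a list of (angle, modality) targets and a description of any repeat."""
    angles = random.sample(ANGLES_DEG, SEQ_LEN)
    if condition == "mixed":
        modalities = [random.choice(["visual", "auditory"]) for _ in range(SEQ_LEN)]
    else:
        modalities = [condition] * SEQ_LEN

    repeat_info = None
    if random.random() < P_REPEAT:
        # Repeat an earlier location as the final target.
        src = random.randrange(SEQ_LEN - 1)
        angles[-1] = angles[src]
        # "Switch" trials: the two members of the matching pair differ in modality,
        # and the repeat immediately follows its source (penultimate -> final).
        repeat_info = {
            "source_index": src,
            "immediate": src == SEQ_LEN - 2,
            "modality_switch": modalities[src] != modalities[-1],
        }
    return list(zip(angles, modalities)), repeat_info

if __name__ == "__main__":
    random.seed(0)
    targets, repeat_info = make_trial("mixed")
    print(targets)
    print(repeat_info)
```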
Pages: 1260-1267
Page count: 8
References
40 in total
[1] Anderson SJ, Mullen KT, Hess RF. Human peripheral spatial resolution for achromatic and chromatic stimuli: limits imposed by optical and retinal factors. Journal of Physiology-London, 1991, 442: 47-64.
[2] Ashmead DH, Davis DL, Northington A. Contribution of listeners' approaching motion to auditory distance perception. Journal of Experimental Psychology: Human Perception and Performance, 1995, 21(2): 239-256.
[3] Avraamides MN, Loomis JM, Klatzky RL, Golledge RG. Functional equivalence of spatial representations derived from vision and language: evidence from allocentric judgments. Journal of Experimental Psychology: Learning, Memory, and Cognition, 2004, 30(4): 801-814.
[4] Awh E, Jonides J. Overlapping mechanisms of attention and spatial working memory. Trends in Cognitive Sciences, 2001, 5(3): 119-126.
[5] Baddeley AD. Attention and Performance, 1980: 521. DOI: 10.4324/9781315793252.
[6] Blauert J. Spatial Hearing: The Psychophysics of Human Sound Localization. 1997.
[7] Bowen AL, Ramachandran R, Muday JA, Schirillo JA. Visual signals bias auditory targets in azimuth and depth. Experimental Brain Research, 2011, 214(3): 403-414.
[8] Bryant DJ. Representing space in language and perception. Mind & Language, 1997, 12(3-4): 239-264.
[9] Burgess N. Spatial memory: how egocentric and allocentric combine. Trends in Cognitive Sciences, 2006, 10(12): 551-557.
[10] Bushara KO, Weeks RA, Ishii K, Catalan MJ, Tian B, Rauschecker JP, Hallett M. Modality-specific frontal and parietal areas for auditory and visual spatial localization in humans. Nature Neuroscience, 1999, 2(8): 759-766.