Sound representing self-motion in virtual environments enhances linear vection

Cited by: 32
Authors
Väljamäe, Aleksander [1]
Larsson, Pontus [1]
Västfjäll, Daniel [1]
Kleiner, Mendel [1]
Affiliation
[1] Chalmers, Div Appl Acoust, S-41296 Gothenburg, Sweden
DOI
10.1162/pres.17.1.43
CLC Classification
TP3 [Computing technology; computer technology]
Discipline Code
0812
Abstract
Sound is an important, but often neglected, component for creating a self-motion illusion (vection) in virtual reality applications such as motion simulators. Apart from auditory motion cues, sound can provide contextual information representing self-motion in a virtual environment. In two experiments we investigated the benefits of hearing an engine sound when presenting auditory (Experiment 1) or auditory-vibrotactile (Experiment 2) virtual environments inducing linear vection. Adding the engine sound to the auditory scene significantly enhanced subjective ratings of vection intensity in Experiment 1, and shortened vection onset times, but did not improve subjective ratings, in Experiment 2. Further analysis using individual imagery vividness scores showed that this disparity between vection measures arose from participants with higher kinesthetic imagery. For participants with lower kinesthetic imagery scores, by contrast, the engine sound enhanced the vection sensation in both experiments. A high correlation with participants' kinesthetic imagery vividness scores suggests that a first-person perspective influences the perception of the engine sound. We hypothesize that self-motion sounds (e.g., the sound of footsteps, engine sound) represent a specific type of acoustic body-centered feedback in virtual environments. The results may therefore contribute to a better understanding of the role of self-representation sounds (sonic self-avatars) in virtual and augmented environments.
Pages: 43-56 (14 pages)