Influence of head tracking on the externalization of speech stimuli for non-individualized binaural synthesis

Cited by: 46
Authors
Hendrickx, Etienne [1]
Stitt, Peter [2]
Messonnier, Jean-Christophe [1]
Lyzwa, Jean-Marc [1]
Katz, Brian F. G. [3]
de Boisheraud, Catherine [1]
Affiliations
[1] Conservatoire Natl Super Mus & Danse Paris, 209 Ave Jean Jaures, F-75019 Paris, France
[2] Univ Paris Saclay, CNRS, Lab Informat Mecan & Sci Ingn, Audio Acoust Grp, F-91405 Orsay, France
[3] Univ Paris 06, Univ Pierre & Marie Curie, Sorbonne Univ, CNRS, Inst Alembert, F-75005 Paris, France
Keywords
Virtual audio; Sound images; Localization; Reverberation; Reproduction; Recordings; Simulation; Listeners; Accuracy; Movement
DOI
10.1121/1.4978612
Chinese Library Classification
O42 [Acoustics]
Discipline codes
070206; 082403
Abstract
Binaural reproduction aims at recreating a realistic audio scene at the ears of the listener using headphones. In the real acoustic world, sound sources tend to be externalized (that is, perceived to be emanating from a source out in the world) rather than internalized (that is, perceived to be emanating from inside the head). Unfortunately, several studies report a collapse of externalization, especially with frontal and rear virtual sources, when listening to binaural content using non-individualized Head-Related Transfer Functions (HRTFs). The present study examines whether or not head movements coupled with a head tracking device can compensate for this collapse. For each presentation, a speech stimulus was presented over headphones at different azimuths, using several intermixed sets of non-individualized HRTFs for the binaural rendering. The head tracker could either be active or inactive, and the subjects could either be asked to rotate their heads or to keep them as stationary as possible. After each presentation, subjects reported to what extent the stimulus had been externalized. In contrast to several previous studies, results showed that head movements can substantially enhance externalization, especially for frontal and rear sources, and that externalization can persist once the subject has stopped moving his/her head. (C) 2017 Acoustical Society of America.
Pages: 2011-2023
Page count: 13