Audiovisual integration in the human perception of materials

Citations: 38
Authors
Fujisaki, Waka [1]
Goda, Naokazu [2]
Motoyoshi, Isamu [3]
Komatsu, Hidehiko [2]
Nishida, Shin'ya [3]
Affiliations
[1] Natl Inst Adv Ind Sci & Technol, Human Technol Res Inst, Tsukuba, Ibaraki, Japan
[2] Natl Inst Physiol Sci, Div Sensory & Cognit Informat, Okazaki, Aichi 444, Japan
[3] NTT Corp, NTT Commun Sci Labs, Atsugi, Kanagawa, Japan
Source
JOURNAL OF VISION | 2014, Vol. 14, No. 4
Keywords
material perception; audio-visual integration; Bayesian integration; surface texture; impact sound; VISUAL-PERCEPTION; SURFACE; REPRESENTATION; TEXTURE; COLOR; FMRI; FORM; INFORMATION; GLOSS;
DOI
10.1167/14.4.12
Chinese Library Classification (CLC)
R77 [Ophthalmology]
Discipline code
100212
Abstract
Interest in the perception of the material of objects has been growing. While material perception is a critical ability for animals to properly regulate behavioral interactions with surrounding objects (e.g., eating), little is known about its underlying processing. Vision and audition provide useful information for material perception; using only its visual appearance or impact sound, we can infer what an object is made from. However, what material is perceived when the visual appearance of one material is combined with the impact sound of another, and what are the rules that govern cross-modal integration of material information? We addressed these questions by asking 16 human participants to rate how likely it was that audiovisual stimuli (48 combinations of visual appearances of six materials and impact sounds of eight materials), along with visual-only and auditory-only stimuli, fell into each of 13 material categories. The results indicated strong interactions between audiovisual material perceptions; for example, the appearance of glass paired with a pepper sound is perceived as transparent plastic. Ratings of material-category likelihoods follow a multiplicative integration rule, in that the categories judged to be likely are consistent with both visual and auditory stimuli. On the other hand, ratings of material properties, such as roughness and hardness, follow a weighted-average rule. Despite the difference in their integration calculations, both rules can be interpreted as optimal Bayesian integration of independent audiovisual estimations for the two types of material judgment, respectively.
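The two integration rules named in the abstract can be illustrated numerically. The sketch below uses hypothetical ratings (not data from the paper): category likelihoods from the two modalities are multiplied, so only categories consistent with both vision and audition remain likely, while a continuous property such as hardness is combined as a weighted average (the 0.7 visual weight is an arbitrary assumption for illustration).

```python
import numpy as np

# Hypothetical unimodal likelihood ratings over four material categories
# for one visual stimulus and one auditory stimulus (illustrative values only).
categories = ["glass", "plastic", "metal", "wood"]
visual_likelihood = np.array([0.6, 0.5, 0.2, 0.1])
auditory_likelihood = np.array([0.1, 0.6, 0.3, 0.2])

# Multiplicative rule for category judgments: a category ends up likely
# only if it is rated likely under BOTH modalities.
audiovisual_category = visual_likelihood * auditory_likelihood
audiovisual_category /= audiovisual_category.sum()  # normalize to sum to 1

# Weighted-average rule for a property judgment (e.g., hardness),
# with an assumed visual weight of 0.7.
w_visual = 0.7
visual_hardness, auditory_hardness = 4.0, 6.0
audiovisual_hardness = (w_visual * visual_hardness
                        + (1 - w_visual) * auditory_hardness)

# "plastic" wins: it is moderately consistent with both modalities, whereas
# "glass" (visually likely, auditorily unlikely) is suppressed by the product.
print(categories[int(np.argmax(audiovisual_category))])
print(round(audiovisual_hardness, 2))
```

Both rules fall out of optimal Bayesian integration of independent unimodal estimates: the product rule for discrete category posteriors, and the reliability-weighted average for continuous property estimates.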
Pages: 20
Related papers
50 records in total (first 10 shown)
  • [1] Automatic audiovisual integration in speech perception
    Gentilucci, Maurizio
    Cattaneo, Luigi
    EXPERIMENTAL BRAIN RESEARCH, 2005, 167 (01) : 66 - 75
  • [3] Perception based method for the investigation of audiovisual integration of speech
    Huhn, Zsofia
    Szirtes, Gabor
    Lorincz, Andras
    Csepe, Valeria
    NEUROSCIENCE LETTERS, 2009, 465 (03) : 204 - 209
  • [4] Causal Inference in Audiovisual Perception
    Mihalik, Agoston
    Noppeney, Uta
    JOURNAL OF NEUROSCIENCE, 2020, 40 (34) : 6600 - 6612
  • [5] Reduced audiovisual integration in synesthesia evidence from bimodal speech perception
    Sinke, Christopher
    Neufeld, Janina
    Zedler, Markus
    Emrich, Hinderk M.
    Bleich, Stefan
    Muente, Thomas F.
    Szycik, Gregor R.
    JOURNAL OF NEUROPSYCHOLOGY, 2014, 8 (01) : 94 - 106
  • [6] The early maximum likelihood estimation model of audiovisual integration in speech perception
    Andersen, Tobias S.
    JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, 2015, 137 (05) : 2884 - 2891
  • [7] Audiovisual Integration in Social Evaluation
    Mileva, Mila
    Tompkinson, James
    Watt, Dominic
    Burton, A. Mike
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-HUMAN PERCEPTION AND PERFORMANCE, 2018, 44 (01) : 128 - 138
  • [8] Human brain activity associated with audiovisual perception and attention
    Degerman, Alexander
    Rinne, Teemu
    Pekkola, Johanna
    Autti, Taina
    Jaaskelainen, Iiro P.
    Sams, Mikko
    Alho, Kimmo
    NEUROIMAGE, 2007, 34 (04) : 1683 - 1691
  • [9] Causal inference and temporal predictions in audiovisual perception of speech and music
    Noppeney, Uta
    Lee, Hwee Ling
    ANNALS OF THE NEW YORK ACADEMY OF SCIENCES, 2018, 1423 (01) : 102 - 116
  • [10] Audiovisual integration in the human brain: a coordinate-based meta-analysis
    Gao, Chuanji
    Green, Jessica J.
    Yang, Xuan
    Oh, Sewon
    Kim, Jongwan
    Shinkareva, Svetlana V.
    CEREBRAL CORTEX, 2023, 33 (09) : 5574 - 5584