Multisensory causal inference is feature-specific, not object-based

Cited: 5
Authors
Badde, Stephanie [1 ]
Landy, Michael S. [2 ,3 ]
Adams, Wendy J. [4 ]
Affiliations
[1] Tufts Univ, Dept Psychol, 490 Boston Ave, Medford, MA 02155 USA
[2] NYU, Dept Psychol, 6 Washington Pl, New York, NY 10003 USA
[3] NYU, Ctr Neural Sci, 6 Washington Pl, New York, NY 10003 USA
[4] Univ Southampton, Dept Psychol, 44 Highfield Campus, Southampton SO17 1BJ, England
Funding
US National Institutes of Health;
Keywords
visual-haptic; roughness; slant; cue integration; causal inference; CROSSMODAL BINDING; UNITY ASSUMPTION; CUE COMBINATION; SIGNALS; INTEGRATION; INFORMATION; PERCEPTION; PHYSIOLOGY; SPEECH;
DOI
10.1098/rstb.2022.0345
Chinese Library Classification
Q [Biological Sciences];
Discipline classification codes
07 ; 0710 ; 09 ;
Abstract
Multisensory integration depends on causal inference about the sensory signals. We tested whether implicit causal-inference judgements pertain to entire objects or focus on task-relevant object features. Participants in our study judged virtual visual, haptic and visual-haptic surfaces with respect to two features, slant and roughness, against an internal standard in a two-alternative forced-choice task. Modelling of participants' responses revealed that the degree to which their perceptual judgements were based on integrated visual-haptic information varied unsystematically across features. For example, a perceived mismatch between visual and haptic roughness would not deter the observer from integrating visual and haptic slant. These results indicate that participants based their perceptual judgements on a feature-specific selection of information, suggesting that multisensory causal inference proceeds not at the object level but at the level of single object features. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Pages: 8