The temporal dynamics of conscious and unconscious audio-visual semantic integration

Cited: 0
Authors
Gao, Mingjie [1 ]
Zhu, Weina [1 ]
Drewes, Jan [2 ]
Affiliations
[1] Yunnan Univ, Sch Informat Sci, Kunming 650091, Peoples R China
[2] Sichuan Normal Univ, Inst Brain & Psychol Sci, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
NATURALISTIC SOUNDS; OCULAR DOMINANCE; SPOKEN WORDS; TIME-COURSE; SPEECH; CORRESPONDENCES; IDENTIFICATION; PERCEPTION; COMPONENTS; SOFTWARE;
DOI
10.1016/j.heliyon.2024.e33828
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
We compared the time course of cross-modal semantic effects induced by naturalistic sounds and spoken words on the processing of visual stimuli, whether visible or suppressed from awareness through continuous flash suppression. We found that, under visible conditions, spoken words elicited audio-visual semantic effects over a longer range of SOAs (-1000, -500, -250 ms) than naturalistic sounds (-500, -250 ms). Performance was generally better with auditory primes, and even more so with semantically congruent primes. Spoken words presented in advance (-1000, -500 ms) outperformed naturalistic sounds; the opposite was true for (near-)simultaneous presentations. Congruent spoken words yielded better categorization performance than congruent naturalistic sounds. The audio-visual semantic congruency effect still occurred with suppressed visual stimuli, although without significant variation in the temporal patterns between auditory types. These findings indicate that: 1. Semantically congruent auditory input can enhance visual processing performance, even when the visual stimulus is imperceptible to conscious awareness. 2. The temporal dynamics are contingent on the auditory stimulus type only when the visual stimulus is visible. 3. Audio-visual semantic integration requires sufficient time to process the auditory information.
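To make the factorial design described in the abstract concrete, the Python sketch below enumerates the conditions (auditory type x congruency x SOA x visibility) and computes a placeholder congruency effect per cell. This is an illustrative assumption-based sketch, not the authors' code or data: the 0 ms level standing in for the "(near-)simultaneous" condition, the `simulated_accuracy` function, and all numeric values are hypothetical.

```python
# Illustrative sketch only (not the authors' analysis): enumerate the design
# described in the abstract and compute a placeholder congruency effect
# (congruent minus incongruent accuracy) for each condition cell.
from itertools import product
import random

AUDITORY_TYPES = ["naturalistic_sound", "spoken_word"]
CONGRUENCY = ["congruent", "incongruent"]
SOAS_MS = [-1000, -500, -250, 0]            # 0 ms assumed for "(near-)simultaneous"
VISIBILITY = ["visible", "CFS_suppressed"]  # CFS = continuous flash suppression

random.seed(0)

def simulated_accuracy(aud_type, congruency, soa_ms, visibility):
    """Placeholder accuracy generator; real values would come from the experiment."""
    base = 0.75 if visibility == "visible" else 0.55
    boost = 0.05 if congruency == "congruent" else 0.0
    return min(1.0, base + boost + random.uniform(-0.02, 0.02))

for aud, soa, vis in product(AUDITORY_TYPES, SOAS_MS, VISIBILITY):
    acc = {c: simulated_accuracy(aud, c, soa, vis) for c in CONGRUENCY}
    effect = acc["congruent"] - acc["incongruent"]
    print(f"{aud:>18} | SOA {soa:>5} ms | {vis:>14} | congruency effect = {effect:+.3f}")
```

Under the abstract's findings, a real dataset would show the congruency effect emerging at more negative SOAs for spoken words than for naturalistic sounds in the visible condition, and without such a temporal difference under CFS suppression.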
Pages: 14