Enhancing learning outcomes through multisensory integration: an fMRI study of audio-visual training in virtual reality

Cited by: 11
Authors
Alwashmi, Kholoud [1 ,4 ,6 ]
Meyer, Georg [2 ]
Rowe, Fiona [3 ]
Ward, Ryan [2 ,5 ]
Affiliations
[1] Univ Liverpool, Fac Hlth & Life Sci, Liverpool, England
[2] Univ Liverpool, Digital Innovat Facil, Liverpool, England
[3] Univ Liverpool, Inst Populat Hlth, Liverpool, England
[4] Princess Nourah bint Abdulrahman Univ, Dept Radiol, Riyadh, Saudi Arabia
[5] Liverpool John Moores Univ, Sch Comp Sci & Math, Liverpool, England
[6] Univ Liverpool, Eleanor Rathbone Bldg,Bedford St South, Liverpool L69 7ZA, England
Keywords
fMRI; Multisensory; Audio-visual; Learning; Virtual-reality; Eye-movement; INFERIOR PARIETAL CORTEX; VISUAL-MOTION SIGNALS; SPATIAL ATTENTION; NEURAL RESPONSES; SPEECH SOUNDS; TIME-COURSE; PERFORMANCE; ACTIVATION; PLASTICITY; STIMULI
DOI
10.1016/j.neuroimage.2023.120483
Chinese Library Classification (CLC)
Q189 [Neuroscience]
Subject classification code
071006
Abstract
The integration of information from different sensory modalities is a fundamental process that enhances perception and performance in both real and virtual reality (VR) environments. Understanding these mechanisms, especially during learning tasks that exploit novel multisensory cue combinations, provides opportunities for developing new rehabilitative interventions.
This study aimed to investigate how functional brain changes support behavioural performance improvements during an audio-visual (AV) learning task. Twenty healthy participants underwent 30 minutes of daily VR training for four weeks. The task was an AV adaptation of a 'scanning training' paradigm that is commonly used in hemianopia rehabilitation. Functional magnetic resonance imaging (fMRI) and performance data were collected at baseline, after two and four weeks of training, and four weeks post-training.
We show that behavioural performance, operationalised as mean reaction time (RT) reduction in VR, improves significantly. In separate tests in a controlled laboratory environment, we showed that the behavioural performance gains in the VR training environment transferred to a significant mean RT reduction for the trained AV voluntary task on a computer screen. Enhancements were observed in both the visual-only and AV conditions, with the latter showing faster response times supported by the presence of audio cues. The behavioural learning effect also transferred to two additional tasks that were tested: a visual search task and an involuntary visual task. Our fMRI results reveal an increase in functional activation (BOLD signal) in multisensory brain regions involved in early-stage AV processing: the thalamus, the caudal inferior parietal lobe and the cerebellum. These functional changes were observed only for the trained, multisensory task and not for unimodal visual stimulation. Functional activation changes in the thalamus were significantly correlated with behavioural performance improvements.
This study demonstrates that adding spatial auditory cues to voluntary visual training in VR leads to augmented activation changes in brain regions involved in multisensory integration, resulting in measurable performance gains across tasks. The findings highlight the potential of VR-based multisensory training as an effective method for enhancing cognitive function and as a potentially valuable tool in rehabilitative programmes.
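The brain-behaviour result reported above (thalamic activation changes correlated with behavioural improvement) rests on a per-participant correlation between an fMRI change score and an RT change score. The sketch below is a minimal illustration of that kind of test only, assuming hypothetical per-participant values; it is not the authors' analysis pipeline, and the variable names and numbers are invented for demonstration.

# Minimal sketch (hypothetical data, not the authors' pipeline): relate each
# participant's change in thalamic BOLD contrast to their reaction-time gain.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 20  # sample size described in the abstract

# Hypothetical per-participant measures:
# increase in thalamic BOLD contrast estimate (post-training minus baseline)
bold_change = rng.normal(loc=0.4, scale=0.15, size=n_participants)
# RT improvement in seconds (baseline minus post-training), loosely coupled to BOLD change
rt_improvement = 0.5 * bold_change + rng.normal(scale=0.05, size=n_participants)

# Pearson correlation between activation change and behavioural gain
r, p = stats.pearsonr(bold_change, rt_improvement)
print(f"r = {r:.2f}, p = {p:.4f}")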
Pages: 18