Understanding multi-modal brain network data: An immersive 3D visualization approach

Cited by: 1
Authors
Pester B. [1 ]
Russig B. [1 ]
Winke O. [1 ]
Ligges C. [2 ]
Dachselt R. [3 ,4 ,5 ]
Gumhold S. [1 ,4 ,5 ]
Affiliations
[1] Chair for Computer Graphics and Visualization, TU Dresden
[2] Jena University Hospital, Department of Child and Adolescent Psychiatry, Psychosomatic Medicine and Psychotherapy, Friedrich Schiller University Jena
[3] Interactive Media Lab Dresden, TU Dresden
[4] Centre for Tactile Internet with Human-in-the-Loop (CeTI), TU Dresden
[5] Cluster of Excellence Physics of Life, TU Dresden (PoL)
Source
Computers and Graphics (Pergamon) | 2022 / Vol. 106
Keywords
EEG brain connectivity; Immersive virtual reality; Origin–destination visualization; Partial directed coherence;
DOI
10.1016/j.cag.2022.05.024
Abstract
Understanding the human brain requires incorporating functional interaction patterns that depend on a variety of features, such as the experimental setup, the strength of directed connectedness, or the variability between individuals or groups. In addition to these external factors, there are internal properties of the brain network, such as the temporal propagation of connections or connectivity patterns that occur only in a distinct frequency range of the signal. Visualizing detected networks while covering all necessary information poses a substantial problem, mainly due to the high number of features that have to be integrated within the same view in a natural spatial context. To address this problem, we propose a new tool that transfers the network into an anatomically arranged origin–destination view in a virtual visual analysis lab. This offers the user an opportunity to assess the temporal evolution of connectivity patterns and provides an intuitive and motivating way of exploring the corresponding features via navigation and interaction in virtual reality (VR). The approach was evaluated in a user study that included participants with a neuroscientific background as well as people working in the field of computer science. As a first proof-of-concept trial, we used functional brain networks derived from time series of electroencephalography recordings evoked by visual stimuli. All participants gave positive general feedback; notably, they saw a benefit in using the VR view instead of the compared 2D desktop variant. This suggests that our application successfully fills a gap in the visualization of high-dimensional brain networks and that it is worthwhile to further pursue and enhance the proposed representation method. © 2022 Elsevier Ltd
Pages: 88–97
Page count: 9
Related papers
50 in total
  • [1] TAMM: TriAdapter Multi-Modal Learning for 3D Shape Understanding
    Zhang, Zhihao
    Cao, Shengcao
    Wang, Yu-Xiong
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 21413 - 21423
  • [2] A 3D multi-modal and multi-dimensional digital brain model as a framework for data sharing
    Mailly, Philippe
    Haber, Suzanne N.
    Groenewegen, Henk J.
    Deniau, Jean-Michel
    JOURNAL OF NEUROSCIENCE METHODS, 2010, 194 (01) : 56 - 63
  • [3] Immersive 3D Visualization of Astronomical Data
    Schaaff, A.
    Berthier, J.
    Da Rocha, J.
    Deparis, N.
    Derriere, S.
    Gaultier, P.
    Houpin, R.
    Normand, J.
    Ocvirk, P.
    ASTRONOMICAL DATA ANALYSIS SOFTWARE AND SYSTEMS: XXIV, 2015, 495 : 125 - 128
  • [4] Evaluation of 3D Feature Descriptors for Multi-modal Data Registration
    Kim, Hansung
    Hilton, Adrian
    2013 INTERNATIONAL CONFERENCE ON 3D VISION (3DV 2013), 2013, : 119 - 126
  • [5] 3DMIT: 3D Multi-modal Instruction Tuning for Scene Understanding
    Li, Zeju
    Zhang, Chao
    Wang, Xiaoyan
    Ren, Ruilong
    Xu, Yifan
    Ma, Ruifei
    Liu, Xiangde
    Wei, Rong
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO WORKSHOPS, ICMEW 2024, 2024,
  • [6] ViBe (Virtual Berlin) - Immersive Interactive 3D Urban Data Visualization
    Al Bondakji, Louna
    Lammich, Anne-Liese
    Werner, Liss C.
    ECAADE SIGRADI 2019: ARCHITECTURE IN THE AGE OF THE 4TH INDUSTRIAL REVOLUTION, VOLUME 3, 2019, : 83 - 90
  • [7] Understanding Fun in Learning to Code: A Multi-Modal Data approach
    Tisza, Gabriella
    Sharma, Kshitij
    Papavlasopoulou, Sofia
    Markopoulos, Panos
    Giannakos, Michail
    PROCEEDINGS OF THE 2022 ACM INTERACTION DESIGN AND CHILDREN, IDC 2022, 2022, : 274 - 287
  • [8] Multi-Modal Segmentation of 3D Brain Scans Using Neural Networks
    Zopes, Jonathan
    Platscher, Moritz
    Paganucci, Silvio
    Federau, Christian
    FRONTIERS IN NEUROLOGY, 2021, 12
  • [9] A digital 3D atlas of the marmoset brain based on multi-modal MRI
    Liu, Cirong
    Ye, Frank Q.
    Yen, Cecil Chern-Chyi
    Newman, John D.
    Glen, Daniel
    Leopold, David A.
    Silva, Afonso C.
    NEUROIMAGE, 2018, 169 : 106 - 116
  • [10] FuseNet: a multi-modal feature fusion network for 3D shape classification
    Zhao, Xin
    Chen, Yinhuang
    Yang, Chengzhuan
    Fang, Lincong
    VISUAL COMPUTER, 2025, 41 (04): : 2973 - 2985