An Auditory-Tactile Visual Saccade-Independent P300 Brain-Computer Interface

Times Cited: 82
Authors
Yin, Erwei [1 ,2 ]
Zeyl, Timothy [3 ]
Saab, Rami [4 ]
Hu, Dewen [1 ]
Zhou, Zongtan [1 ]
Chau, Tom [3 ]
Affiliations
[1] Natl Univ Def Technol, Coll Mechatron Engn & Automat, Changsha 410073, Hunan, Peoples R China
[2] China Astronaut Res & Training Ctr, Natl Key Lab Human Factors Engn, Beijing 100094, Peoples R China
[3] Univ Toronto, Inst Biomat & Biomed Engn, Holland Bloorview Kids Rehabil Hosp, Bloorview Res Inst, Toronto, ON M4G 1R8, Canada
[4] McMaster Univ, Dept Elect & Comp Engn, Hamilton, ON L8S 4L8, Canada
Funding
National Natural Science Foundation of China; Natural Sciences and Engineering Research Council of Canada;
Keywords
Brain computer interface; P300 event-related potentials; bimodal stimuli; auditory; tactile; LANGUAGE NETWORKS; BCI; ATTENTION; POTENTIALS; WHEELCHAIR; SPELLER; SOUND;
DOI
10.1142/S0129065716500015
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Most P300 event-related potential (ERP)-based brain-computer interface (BCI) studies focus on gaze shift-dependent BCIs, which cannot be used by people who have lost voluntary eye movement. However, the performance of visual saccade-independent P300 BCIs is generally poor. To improve saccade-independent BCI performance, we propose a bimodal P300 BCI approach that simultaneously employs auditory and tactile stimuli. The proposed P300 BCI is a vision-independent system because no visual interaction is required of the user. Specifically, we designed a direction-congruent bimodal paradigm in which auditory and tactile stimuli are presented randomly and simultaneously from the same direction. Furthermore, the channels and number of trials were tailored to each user to improve online performance. With 12 participants, the average online information transfer rate (ITR) of the bimodal approach improved by 45.43% and 51.05% over that of the auditory-only and tactile-only approaches, respectively. Importantly, the average online ITR of the bimodal approach, including the break time between selections, reached 10.77 bits/min. These findings suggest that the proposed bimodal system holds promise as a practical visual saccade-independent P300 BCI.
Pages: 16
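The online ITRs quoted in the abstract are conventionally computed with the Wolpaw formula, which combines the number of selectable targets, the classification accuracy, and the time per selection (including break time). The sketch below is illustrative only; the function name and parameter values are assumptions, not taken from the paper.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, seconds_per_selection: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    n_classes: number of selectable targets
    accuracy: probability of a correct selection (0..1)
    seconds_per_selection: total time per selection, breaks included
    """
    n, p = n_classes, accuracy
    if p >= 1.0:
        bits = math.log2(n)                      # perfect accuracy: log2(N) bits/selection
    elif p <= 1.0 / n:
        bits = 0.0                               # at or below chance: clamp to zero
    else:
        bits = (math.log2(n)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / seconds_per_selection   # bits/selection -> bits/min
```

For example, a hypothetical 4-target speller at 100% accuracy and 12 s per selection yields 10 bits/min, in the same range as the 10.77 bits/min reported for the bimodal system.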
相关论文
共 78 条
  • [31] An efficient P300-based brain-computer interface for disabled subjects
    Hoffmann, Ulrich
    Vesin, Jean-Marc
    Ebrahimi, Touradj
    Diserens, Karin
    [J]. JOURNAL OF NEUROSCIENCE METHODS, 2008, 167 (01) : 115 - 125
  • [32] Temporal factors affecting somatosensory-auditory interactions in speech processing
    Ito, Takayuki
    Gracco, Vincent L.
    Ostry, David J.
    [J]. FRONTIERS IN PSYCHOLOGY, 2014, 5
  • [33] An exploratory study of factors affecting single trial P300 detection
    Jansen, BH
    Allam, A
    Kota, P
    Lachance, K
    Osho, A
    Sundaresan, K
    [J]. IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2004, 51 (06) : 975 - 978
  • [34] AN ERP-BASED BCI USING AN ODDBALL PARADIGM WITH DIFFERENT FACES AND REDUCED ERRORS IN CRITICAL FUNCTIONS
    Jin, Jing
    Allison, Brendan Z.
    Zhang, Yu
    Wang, Xingyu
    Cichocki, Andrzej
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2014, 24 (08)
  • [35] Toward brain-computer interface based wheelchair control utilizing tactually-evoked event-related potentials
    Kaufmann, Tobias
    Herweg, Andreas
    Kuebler, Andrea
    [J]. JOURNAL OF NEUROENGINEERING AND REHABILITATION, 2014, 11
  • [36] Comparison of tactile, auditory, and visual modality for brain-computer interface use: a case study with a patient in the locked-in state
    Kaufmann, Tobias
    Holz, Elisa M.
    Kuebler, Andrea
    [J]. FRONTIERS IN NEUROSCIENCE, 2013, 7
  • [37] A Unified Probabilistic Approach to Improve Spelling in an Event-Related Potential-Based Brain-Computer Interface
    Kindermans, Pieter-Jan
    Verschore, Hannes
    Schrauwen, Benjamin
    [J]. IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2013, 60 (10) : 2696 - 2705
  • [38] Wearable Sensorimotor Enhancer for Fingertip Based on Stochastic Resonance Effect
    Kurita, Yuichi
    Shinohara, Minoru
    Ueda, Jun
    [J]. IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 2013, 43 (03) : 333 - 337
  • [39] Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain-computer interface
    LaFleur, Karl
    Cassady, Kaitlin
    Doud, Alexander
    Shades, Kaleb
    Rogin, Eitan
    He, Bin
    [J]. JOURNAL OF NEURAL ENGINEERING, 2013, 10 (04)
  • [40] Gaze independent brain-computer speller with covert visual search tasks
    Liu, Yang
    Zhou, Zongtan
    Hu, Dewen
    [J]. CLINICAL NEUROPHYSIOLOGY, 2011, 122 (06) : 1127 - 1136