Channel capacity in brain-computer interfaces

Cited by: 10
Authors
da Silva Costa, Thiago Bulhoes [1,7]
Suarez Uribe, Luisa Fernanda [1,7]
de Carvalho, Sarah Negreiros [2,7]
Soriano, Diogo Coutinho [3,7]
Castellano, Gabriela [4,7]
Suyama, Ricardo [5,7]
Attux, Romis [1,7]
Panazio, Cristiano [6]
Affiliations
[1] Univ Campinas UNICAMP, FEEC, Campinas, SP, Brazil
[2] Fed Univ Ouro Preto UFOP, ICEA, Joao Monlevade, MG, Brazil
[3] Fed Univ ABC UFABC, CECS, Sao Bernardo Do Campo, SP, Brazil
[4] Univ Campinas UNICAMP, IFGW, Campinas, SP, Brazil
[5] Fed Univ ABC UFABC, CECS, Santo Andre, SP, Brazil
[6] Univ Sao Paulo, Poli USP, Sao Paulo, SP, Brazil
[7] Brazilian Inst Neurosci & Neurotechnol BRAINN, Campinas, SP, Brazil
Funding
São Paulo Research Foundation (FAPESP);
Keywords
brain-computer interface; information transfer rate; channel capacity;
DOI
10.1088/1741-2552/ab6cb7
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Discipline classification code
0831;
Abstract
Objective. Adapted from the concept of channel capacity, the information transfer rate (ITR) has been widely used to evaluate the performance of a brain-computer interface (BCI). However, its traditional formula assumes a discrete memoryless channel whose transition matrix presents very particular symmetries. As an alternative way to compute the ITR, this work presents a more general closed-form expression (also based on that channel model, but with less restrictive assumptions) and, with the aid of a selection heuristic based on a wrapper algorithm, extends this formula to detect classes that deteriorate the operation of a BCI system. Approach. The benchmark is a steady-state visually evoked potential (SSVEP)-based BCI dataset with 40 frequencies/classes, in which two scenarios are tested: (1) our proposed formula is used and the classes are evaluated gradually in the order of the class labels provided with the dataset; and (2) the same formula is used, but with the classes evaluated progressively by a wrapper algorithm. In both scenarios, canonical correlation analysis (CCA) is used to detect SSVEPs. Main results. Before and after class selection using this alternative ITR, the average capacity among all subjects increases from 3.71 +/- 1.68 to 4.79 +/- 0.70 bits per symbol (p-value < 0.01), and, for a supposedly BCI-illiterate subject, the capacity increases from 1.53 to 3.90 bits per symbol. Significance. Besides indicating a consistent formula to compute the ITR, this work provides an efficient method to perform channel assessment in the context of a BCI experiment and argues that this method can be used to study BCI illiteracy.
Pages: 10
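
For context on the abstract above: the traditional ITR formula (Wolpaw's approximation) models the BCI as a discrete memoryless channel with uniform priors, a single accuracy P shared by all N classes, and errors spread uniformly over the remaining N - 1 classes. The more general quantities it approximates are the mutual information of that channel and its capacity; the paper's own closed-form expression is not reproduced here, only these standard definitions.

\mathrm{ITR}_{\text{Wolpaw}} = \log_2 N + P \log_2 P + (1 - P)\,\log_2\!\frac{1 - P}{N - 1} \quad \text{bits per symbol}

I(X;Y) = \sum_{x}\sum_{y} p(x)\, p(y \mid x)\, \log_2\!\frac{p(y \mid x)}{\sum_{x'} p(x')\, p(y \mid x')}, \qquad C = \max_{p(x)} I(X;Y)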
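
The sketch below is an illustration under stated assumptions, not the authors' exact formulation: it estimates the transition matrix of a discrete memoryless channel from a confusion matrix (rows = true class, columns = decided class), scores it by mutual information under uniform priors, and applies a greedy wrapper-style heuristic that removes classes whose exclusion raises the rate. The function names, the stopping rule, and the toy data are hypothetical.

import numpy as np

def mutual_information(confusion, eps=1e-12):
    """Mutual information I(X;Y) in bits for a DMC estimated from a confusion
    matrix (rows = true class, columns = decided class), assuming uniform
    priors over the selected classes."""
    counts = np.asarray(confusion, dtype=float)
    trans = counts / np.maximum(counts.sum(axis=1, keepdims=True), eps)  # p(y|x)
    px = np.full(trans.shape[0], 1.0 / trans.shape[0])                   # uniform p(x)
    py = px @ trans                                                      # p(y)
    ratio = trans / np.maximum(py, eps)
    return float(np.sum(px[:, None] * trans * np.log2(np.maximum(ratio, eps))))

def greedy_class_selection(confusion, min_classes=2):
    """Wrapper-style heuristic: repeatedly remove the class whose exclusion
    yields the largest gain in mutual information, stopping when no removal
    helps or when min_classes remain. Returns kept class indices and rate."""
    kept = list(range(confusion.shape[0]))
    best = mutual_information(confusion)
    while len(kept) > min_classes:
        scores = []
        for drop in kept:
            idx = [k for k in kept if k != drop]
            scores.append((mutual_information(confusion[np.ix_(idx, idx)]), drop))
        gain, drop = max(scores)
        if gain <= best:
            break
        best, kept = gain, [k for k in kept if k != drop]
    return kept, best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy 6-class confusion matrix: mostly correct, one badly confused class.
    C = np.diag(rng.integers(30, 40, size=6)).astype(float)
    C += rng.integers(0, 3, size=(6, 6))
    C[5] = rng.integers(5, 10, size=6)   # class 5 behaves poorly
    kept, rate = greedy_class_selection(C)
    print("kept classes:", kept, "rate (bits/symbol): %.2f" % rate)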
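
Similarly, a minimal sketch of the standard CCA-based SSVEP detector referred to in the approach: for each candidate stimulation frequency, sine/cosine references (with harmonics) are built, and the frequency whose references correlate best with the multichannel EEG segment is chosen. The sampling rate, harmonic count, and channel count below are illustrative assumptions, not values from the paper's dataset.

import numpy as np

def canonical_correlation(X, Y):
    """Largest canonical correlation between the column spaces of X and Y
    (rows = samples), computed via QR factorization and SVD."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return float(np.clip(s[0], 0.0, 1.0))

def ssvep_cca_classify(eeg, freqs, fs, n_harmonics=3):
    """Return the index of the stimulation frequency with the highest
    canonical correlation. eeg has shape (n_samples, n_channels)."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in freqs:
        ref = np.column_stack(
            [np.sin(2 * np.pi * h * f * t) for h in range(1, n_harmonics + 1)] +
            [np.cos(2 * np.pi * h * f * t) for h in range(1, n_harmonics + 1)])
        scores.append(canonical_correlation(eeg, ref))
    return int(np.argmax(scores)), scores

if __name__ == "__main__":
    fs, dur, freqs = 250.0, 2.0, [8.0, 10.0, 12.0, 15.0]
    t = np.arange(int(fs * dur)) / fs
    rng = np.random.default_rng(1)
    # Synthetic 8-channel segment containing a 12 Hz SSVEP plus noise.
    eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t)[:, None] + rng.normal(size=(t.size, 8))
    winner, scores = ssvep_cca_classify(eeg, freqs, fs)
    print("detected frequency:", freqs[winner], "Hz")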