Supporting Human-Robot Interaction by Projected Augmented Reality and a Brain Interface

Cited by: 1
Authors
De Pace, Francesco [1 ]
Manuri, Federico [2 ]
Bosco, Matteo [1 ]
Sanna, Andrea [2 ]
Kaufmann, Hannes [1 ]
Affiliations
[1] TU Wien, Virtual and Augmented Reality Group, A-1040 Vienna, Austria
[2] Politecnico di Torino, Department of Control and Computer Engineering, I-10129 Turin, Italy
Keywords
Assistive robotics; augmented reality (AR); brain interface; NextMind; severe motor impairment; steady-state visual evoked potential (SSVEP);
DOI
10.1109/THMS.2024.3414208
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This article presents a brain-computer interface (BCI) coupled with an augmented reality (AR) system to support human-robot interaction in controlling a robotic arm for pick-and-place tasks. BCIs can process steady-state visual evoked potentials (SSVEPs), which are signals generated in response to visual stimuli. The visual stimuli may be conveyed to the user with AR systems, expanding the range of possible applications. The proposed approach leverages the capabilities of the NextMind BCI to enable users to select objects within the range of the robotic arm. By displaying a visual anchor associated with each object in the scene through projected AR, the NextMind device can detect when users focus their gaze on one of them, thus triggering the pick-up action of the robotic arm. The proposed system has been designed considering the needs and limitations of mobility-impaired people, supporting them in controlling a robotic arm for pick-and-place tasks. Two different approaches for positioning the visual anchors are proposed and analyzed. Experimental tests involving users show that both approaches are well received. The system's performance is highly robust, allowing users to select objects in an easy, fast, and reliable way.
Pages: 599-608
Number of pages: 10
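
To make the interaction flow described in the abstract concrete, the following is a minimal Python sketch of the anchor-selection loop: one projected visual anchor is registered per reachable object, and a confident SSVEP focus event on an anchor triggers the pick-up. The `FocusEvent` stream, the confidence threshold, and the `RobotArm` class are hypothetical placeholders; the actual system is built on the NextMind SDK and a real robotic-arm controller, whose APIs are not reproduced here.

```python
# Hypothetical sketch of the anchor-selection loop described in the abstract.
# `FocusEvent`, the event stream, and `RobotArm` are placeholders, not the
# NextMind SDK or the actual robotic-arm API used by the authors.

from dataclasses import dataclass
from typing import Dict, Iterable, Tuple


@dataclass
class FocusEvent:
    anchor_id: str      # ID of the projected visual anchor the user fixated
    confidence: float   # decoder confidence for the SSVEP response (0..1)


class RobotArm:
    """Stand-in for the pick-and-place controller of the robotic arm."""

    def pick_and_place(self, position: Tuple[float, float, float]) -> None:
        print(f"Picking object at {position} and placing it at the drop-off point")


def run_selection_loop(
    arm: RobotArm,
    anchor_positions: Dict[str, Tuple[float, float, float]],
    focus_events: Iterable[FocusEvent],
    threshold: float = 0.8,
) -> None:
    """Trigger a pick-up whenever the BCI reports a confident fixation
    on one of the projected visual anchors."""
    for event in focus_events:
        if event.confidence < threshold:
            continue  # ignore weak or ambiguous SSVEP responses
        position = anchor_positions.get(event.anchor_id)
        if position is not None:
            arm.pick_and_place(position)


if __name__ == "__main__":
    # One projected anchor per object within the arm's reach (assumed layout).
    anchors = {
        "anchor_A": (0.30, 0.10, 0.05),
        "anchor_B": (0.25, -0.15, 0.05),
    }
    simulated_events = [FocusEvent("anchor_B", 0.92)]  # stand-in for BCI output
    run_selection_loop(RobotArm(), anchors, simulated_events)
```

In a design of this kind, the two anchor-positioning approaches compared in the article would differ only in where the anchors are projected relative to the objects, while the selection logic itself would remain unchanged.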