A Multipurpose Human–Machine Interface via 3D-Printed Pressure-Based Force Myography

Times Cited: 7
Authors
Zhou, Hao [1 ,2 ]
Tawk, Charbel [3 ]
Alici, Gursel [1 ,2 ]
Affiliations
[1] Univ Wollongong, Sch Mech Mat Mechatron & Biomed Engn, Appl Mechatron & Biomed Engn Res AMBER Grp, Wollongong, NSW 2522, Australia
[2] Univ Wollongong, Fac Engn & Informat Sci, Wollongong, NSW 2522, Australia
[3] Lebanese Amer Univ, Dept Ind & Mech Engn, Byblos, Lebanon
Keywords
Artificial intelligence (AI); force myography; gesture recognition; human-machine interface (HMI); soft robotics; wearable sensors; GESTURE RECOGNITION; SURFACE-ELECTROMYOGRAPHY;
DOI
10.1109/TII.2024.3375376
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Classification Number
0812;
Abstract
Artificially intelligent (AI), powerful, and reliable human-machine interfaces (HMIs) are highly desired for wearable technologies, which have proven to be the next advancement in how humans interact with physical, digital, and mixed environments. To this end, we report an innovative noninvasive, lightweight, low-cost, wearable, and soft pressure-based force myography (pFMG) HMI in the form of an armband. The armband acquires stable mechanical biosignals in the form of air pressure readings generated when forces from muscle contraction and relaxation deform its pressure-sensitive chambers (PSCs). The PSCs exhibit a fast response to mechanical biosignals, negligible hysteresis, repeatability, reproducibility, reliability, stability, minimal calibration requirements, and durability (more than 1 500 000 cycles). The pFMG armband is resistant to sweat, body hair, worn clothing, and scars, and is resilient to external mechanical deformations. We demonstrate the capability and versatility of the pFMG-based HMI armband in interacting with and controlling collaborative robot manipulators, robotic prosthetic hands, drones, computer games, and any system where humans are in the loop. The control signals are generated by a machine learning algorithm that decodes and classifies the acquired biosignals of different hand gestures to rapidly and accurately recognize the user's intentions. The easy, direct fabrication and customization of the armband, together with its ability to decode any desired gesture rapidly and reliably from stable biosignals, make it ideal for integration into AI-powered HMI applications.
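The pipeline summarized in the abstract (windowed pressure readings from the chambers classified into gesture classes by a machine learning model) can be illustrated with a minimal sketch. The code below is not the authors' implementation; the chamber count, window length, gesture set, feature choice, and classifier (linear discriminant analysis from scikit-learn) are all illustrative assumptions, and random data stands in for real armband recordings.

# Minimal sketch, not the authors' code: classify windowed pressure readings
# from a hypothetical 4-chamber pFMG armband into hand gestures.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

N_CHAMBERS = 4                                  # assumed number of PSCs
WINDOW = 50                                     # assumed samples per window
GESTURES = ["rest", "fist", "open hand", "pinch"]  # illustrative gesture set

def extract_features(window: np.ndarray) -> np.ndarray:
    # Simple per-chamber features: mean and standard deviation of pressure.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Placeholder data standing in for recorded pressure windows per gesture.
rng = np.random.default_rng(0)
X = np.array([extract_features(rng.normal(loc=g, scale=1.0,
                                           size=(WINDOW, N_CHAMBERS)))
              for g in range(len(GESTURES)) for _ in range(100)])
y = np.repeat(np.arange(len(GESTURES)), 100)

# Train a linear classifier and report held-out accuracy.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In a real deployment, the predicted gesture label would be mapped to a control command for the target system (robot manipulator, prosthetic hand, drone, or game input).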
Pages: 8838 - 8849
Number of Pages: 12