A gesture recognition algorithm in a robot therapy for ASD children

Cited: 16
Authors
Ivani, Alessia Silvia [1 ]
Giubergia, Alice [1 ]
Santos, Laura [1 ,2 ]
Geminiani, Alice [1 ,3 ]
Annunziata, Silvia [4 ]
Caglio, Arianna [4 ]
Olivieri, Ivana [4 ]
Pedrocchi, Alessandra [1 ]
Affiliations
[1] Politecn Milan, NEAR Lab, Dept Elect Informat & Bioengn, Via Ponzio 34, I-20133 Milan, Italy
[2] Inst Super Tecn, Inst Syst & Robot, Ave Rovisco Pais 1, P-1049001 Lisbon, Portugal
[3] Univ Pavia, Dept Brain & Behav Sci, Via Forlanini 6, I-27100 Pavia, Italy
[4] Fdn Don Carlo Gnocchi, IRCCS, Via Alfonso Capecelatro 66, I-20148 Milan, Italy
Keywords
Gesture recognition; Artificial neural networks classification; ASD; Human robot interaction; Real-time classification; ADAPTIVE HISTOGRAM EQUALIZATION; SYSTEM;
DOI
10.1016/j.bspc.2022.103512
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Discipline Code
0831 ;
Abstract
Children with Autism Spectrum Disorders (ASDs) exhibit significant impairments in gesture imitation. The newest interventions are based on Human-Robot Interaction (HRI), since children with ASD cope well with stylized, rule-based and predictable systems. These collaborative approaches encompass therapy games based on joint exercises, imitation and interaction between robots and children. This paper's aim was to implement an algorithm that automatically recognizes small and similar gestures within IOGIOCO, a humanoid-robot therapy for children with ASD. IOGIOCO is a multi-level HRI therapy meant to teach 19 meaningful gestures in a semantic framework based on feedback interaction. Gestures were tracked as 3D coordinates of body keypoints captured by a Kinect. A Residual Neural Network was trained on a segmented dataset acquired within this study to generate an offline model, which was then exploited for real-time classification using a sliding window. Feedback in the form of sound stimuli from the NAO robot was provided based on the automatic evaluation of each performance. Clinical acquisitions were carried out on 4 children with ASD within the IOGIOCO therapy. Offline recognition was successful: using Artificial Neural Networks, we reached 95% test accuracy on 19 gestures. Real-time recognition on healthy subjects reached 94% accuracy. Clinical applications were evaluated through the F1 score, which reached 79%. These outcomes are encouraging considering the wide gesture set and the challenges the therapy raises. Such an automatic algorithm can decrease the therapist's workload while increasing the robustness of the therapy and the engagement of the child.
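The abstract describes feeding streamed Kinect keypoint frames to a trained classifier through a sliding window. The paper's code and parameters are not given here, so the following is only a minimal sketch of that windowing idea: `WINDOW_SIZE`, `STRIDE`, and the `classify_window` placeholder (which stands in for the trained residual network) are all assumptions, not the authors' actual implementation.

```python
from collections import Counter, deque

WINDOW_SIZE = 30   # assumed number of keypoint frames per classification window
STRIDE = 10        # assumed hop (in frames) between successive classifications

def classify_window(window):
    """Placeholder for the trained network: here, just the majority label
    of the frames in the window (each frame carries a ground-truth label
    purely so this sketch is runnable)."""
    labels = [frame["label"] for frame in window]
    return Counter(labels).most_common(1)[0][0]

def sliding_window_predictions(frame_stream, window_size=WINDOW_SIZE, stride=STRIDE):
    """Yield one gesture prediction every `stride` frames once the buffer
    holds a full window, as frames arrive in real time."""
    buffer = deque(maxlen=window_size)       # oldest frame is evicted automatically
    frames_since_last = 0
    for frame in frame_stream:
        buffer.append(frame)
        frames_since_last += 1
        if len(buffer) == window_size and frames_since_last >= stride:
            yield classify_window(list(buffer))
            frames_since_last = 0

# Usage: a synthetic 60-frame stream, first half gesture "wave", second "point".
stream = [{"label": "wave"} for _ in range(30)] + [{"label": "point"} for _ in range(30)]
print(list(sliding_window_predictions(stream)))  # → ['wave', 'wave', 'point', 'point']
```

The overlap between consecutive windows (here 20 of 30 frames) is what lets short, similar gestures be caught without waiting for a full non-overlapping segment; the real system would replace `classify_window` with the network's forward pass over the 3D keypoint coordinates.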
Pages: 12