Lighting control with Myo armband based on customized classifier

Cited: 0
Authors
Jiang Y. [1 ,2 ,3 ]
Yang X. [1 ,2 ,3 ]
Zhang J. [1 ,2 ,3 ]
Song Y. [1 ,2 ,3 ]
Affiliations
[1] Key Laboratory of Acoustic Visual Technology and Intelligent Control System, Communication University of China, Beijing
[2] Beijing Key Laboratory of Modern Entertainment Technology, Communication University of China, Beijing
[3] School of Information and Communication Engineering, Communication University of China, Beijing
Source
Journal of China Universities of Posts and Telecommunications | 2022, Vol. 29, No. 4
Keywords
customized classifier; gesture recognition; lighting control; Myo armband; surface electromyography;
DOI
10.19682/j.cnki.1005-8885.2022.2023
Abstract
This paper focuses on gesture recognition and interactive lighting control. Gesture data are collected with the Myo armband, which records surface electromyography (sEMG). Because many factors affect sEMG, a customized classifier built from user calibration data is used for gesture recognition. Three machine learning classifiers suited to small sample sets, k-nearest neighbor (KNN), support vector machine (SVM), and naive Bayes (NB), are selected to classify four gesture actions. The performance of the three classifiers is tested under different training parameters, different input features, including root mean square (RMS), mean absolute value (MAV), waveform length (WL), slope sign change (SSC) number, zero crossing (ZC) number, and variance (VAR), and different input channels. Experimental results show that the NB classifier, which assumes the features follow a multinomial distribution, performs best, reaching more than 95% accuracy. Finally, an interactive stage lighting control system based on Myo armband gesture recognition is implemented. © 2022, Beijing University of Posts and Telecommunications. All rights reserved.
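The six time-domain features named in the abstract can be sketched per channel window as follows. This is a minimal NumPy illustration, not the authors' code: windowing of the raw Myo stream and the amplitude thresholds sometimes applied when counting SSC and ZC are omitted.

```python
import numpy as np

def semg_features(window):
    """Compute the six time-domain features listed in the abstract
    for one sEMG channel window (1-D array of samples)."""
    x = np.asarray(window, dtype=float)
    dx = np.diff(x)                          # first difference between samples
    rms = np.sqrt(np.mean(x ** 2))           # root mean square (RMS)
    mav = np.mean(np.abs(x))                 # mean absolute value (MAV)
    wl = np.sum(np.abs(dx))                  # waveform length (WL)
    ssc = np.sum(dx[:-1] * dx[1:] < 0)       # slope sign change (SSC) count
    zc = np.sum(x[:-1] * x[1:] < 0)          # zero crossing (ZC) count
    var = np.var(x)                          # variance (VAR)
    return np.array([rms, mav, wl, ssc, zc, var])
```

A gesture sample from the 8-channel Myo armband would yield one such feature vector per channel; concatenating the per-channel vectors gives one input row for the KNN, SVM, or NB classifier. All six features are non-negative, which is consistent with the multinomial assumption of the NB variant the abstract reports as best.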
Pages: 106-116
Page count: 10
References
26 references in total
  • [1] Chakraborty B.K., Sarma D., Bhuyan M.K., et al., A review of constraints on vision-based gesture recognition for human-computer interaction, IET Computer Vision, 12, 1, pp. 3-15, (2017)
  • [2] S.R.W., Scott B.C., Rubin D.M., et al., Development of a novel cardiopulmonary resuscitation measurement tool using real-time feedback from wearable wireless instrumentation, Resuscitation, 137, pp. 183-189, (2019)
  • [3] Mulling T., Sathiyanarayanan M., Characteristics of hand gesture navigation: A case study using a wearable device (MYO), Proceedings of the 2015 British HCI Conference, pp. 283-284, (2015)
  • [4] Bisi S., De Luca L., Shrestha B., et al., Development of an EMG-controlled mobile robot, Robotics, 7, 3, (2018)
  • [5] Hassan H.F., Abou-Loukh S.J., Ibraheem I.K., Teleoperated robotic arm movement using electromyography signal with wearable Myo armband, Journal of King Saud University: Engineering Sciences, 32, 6, pp. 378-387, (2020)
  • [6] M.A.L., Romero P.A., Quevedo W.X., et al., Virtual rehabilitation system for fine motor skills using a functional hand orthosis, Augmented Reality, Virtual Reality, and Computer Graphics: Proceedings of the 5th International Conference, pp. 78-94, (2018)
  • [7] Zhang Z., Su Z.Y., Yang G., Real-time Chinese sign language recognition based on artificial neural networks, Proceedings of the 2019 IEEE International Conference on Robotics and Biomimetics, pp. 1413-1417, (2019)
  • [8] Wahid M.F., Tafreshi R., et al., Subject-independent hand gesture recognition using normalization and machine learning algorithms, Journal of Computational Science, 27, pp. 69-76, (2018)
  • [9] Chowdhury R.H., Reaz M.B.I., et al., Surface electromyography signal processing and classification techniques, Sensors, 13, 9, pp. 12431-12466, (2013)
  • [10] Pizzolato S., Tagliapietra L., Cognolato M., et al., Comparison of six electromyography acquisition setups on hand movement classification tasks, PLoS ONE, 12, 10, (2017)