Human-Robot Interaction (HRI) through hand gestures for possible future war robots: A leap motion controller application

Cited by: 0
Authors
Sesli, Erhan [1 ]
Affiliations
[1] Karadeniz Tech Univ, Technol Fac, Dept Elect & Telecommun Engn, TR-61080 Trabzon, Turkiye
Keywords
Human-robot interaction; Cumulative distribution function; Deep neural network; Leap motion controller; Hand gesture recognition; Density
DOI
10.1007/s11042-023-15278-0
Chinese Library Classification (CLC)
TP [Automation technology; Computer technology]
Discipline classification code
0812
Abstract
In this article, a futuristically possible human (commander)-robot (soldier) interaction (HRI) based on effective hand gesture recognition is discussed. Methodologically, the Leap Motion Controller (LMC), which is frequently used in virtual reality applications, was used to obtain hand gesture features. Only the distances of the fingers to each other and to the normal of the hand are considered as features, and whether high performance can be reached under these constraints is investigated. The performances of six hand gesture recognition methods, classified as light, medium-weight, and complex, were then examined with random dynamic movements and at different frame counts. The proposed cumulative distribution function (CDF) based deep neural network (DNN) approach achieved an accuracy of 88.44%, an improvement of 4.76% over the second closest method, Kullback-Leibler divergence. Despite the limited feature set, high performance was achieved. The study contains no mechanical or electronic robot design; however, the computer used as the decision mechanism of the robot was modeled and made ready for application. In this sense, we believe that this work can be a pioneering study in the military field.
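The abstract describes only the general pipeline: frame-wise distance features obtained from the Leap Motion Controller are summarized with a cumulative distribution function (CDF) and then classified by a deep neural network (DNN). The Python sketch below illustrates one way such a descriptor could be built; it is not the paper's implementation. The choice of per-frame features (pairwise fingertip distances plus each fingertip's distance to the palm-normal axis), the grid size, and all function names are assumptions made for illustration, and the data are synthetic.

import numpy as np

def frame_features(fingertips, palm_center, palm_normal):
    # Distance features for one frame: the 10 pairwise fingertip distances
    # plus each fingertip's perpendicular distance to the palm-normal axis
    # (15 values per frame). `fingertips` is a (5, 3) array of tip positions;
    # `palm_center` and `palm_normal` are (3,) arrays.
    pairwise = [np.linalg.norm(fingertips[i] - fingertips[j])
                for i in range(5) for j in range(i + 1, 5)]
    n = palm_normal / np.linalg.norm(palm_normal)
    rel = fingertips - palm_center
    perp = rel - np.outer(rel @ n, n)   # component orthogonal to the palm normal
    return np.concatenate([pairwise, np.linalg.norm(perp, axis=1)])

def cdf_descriptor(frames, grid_size=20):
    # Summarize a variable-length gesture (a list of per-frame feature vectors)
    # by the empirical CDF of each feature evaluated on a fixed grid, so every
    # gesture maps to a descriptor of the same length regardless of frame count.
    feats = np.stack(frames)                                 # (n_frames, n_features)
    grid = np.linspace(feats.min(axis=0), feats.max(axis=0), grid_size)
    cdf = (feats[None, :, :] <= grid[:, None, :]).mean(axis=1)
    return cdf.ravel()                                       # (grid_size * n_features,)

# Synthetic stand-in for Leap Motion frames; a DNN classifier would be trained
# on fixed-length descriptors such as this one.
rng = np.random.default_rng(0)
frames = [frame_features(rng.normal(size=(5, 3)),
                         rng.normal(size=3),
                         rng.normal(size=3)) for _ in range(60)]
print(cdf_descriptor(frames).shape)                          # (300,)

Evaluating each feature's empirical CDF on a fixed grid is what makes the descriptor length independent of the number of frames, which matches the abstract's point about testing the methods at different frame counts; the grid size simply trades descriptor length against how finely each distance distribution is resolved.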
Pages: 36547-36570
Page count: 24