Human-Animal Affective Robot Touch Classification Using Deep Neural Network

Cited by: 6
Authors
Al-mashhadani, Mohammed Ibrahim Ahmed [1 ]
Aldhyani, Theyazn H. H. [2 ]
Al-Adhaileh, Mosleh Hmoud [3 ]
Bamhdi, Alwi M. [4 ]
Alzahrani, Mohammed Y. [5 ]
Alsaade, Fawaz Waselallah [6 ]
Alkahtani, Hasan [1 ,6 ]
Affiliations
[1] AL Iraqia Univ, Coll Educ, Comp Dept, Baghdad, Iraq
[2] King Faisal Univ, Community Coll Abqaiq, Al Hasa, Saudi Arabia
[3] King Faisal Univ, Deanship E Learning & Distance Educ, Al Hasa, Saudi Arabia
[4] UMM Al Qura Univ, Coll Comp, Mecca, Saudi Arabia
[5] Albaha Univ, Dept Comp Sci & Informat Technol, Al Baha, Saudi Arabia
[6] King Faisal Univ, Coll Comp Sci & Informat Technol, Al Hasa, Saudi Arabia
Source
COMPUTER SYSTEMS SCIENCE AND ENGINEERING | 2021, Vol. 38, No. 1
Keywords
Touch gesture recognition; touch gesture classification; deep learning; recognition; model
DOI
10.32604/csse.2021.014992
CLC (Chinese Library Classification) number
TP3 [Computing technology; computer technology]
Discipline classification code
0812
Abstract
Touch gesture recognition is an important aspect of human-robot interaction, as it makes such interaction effective and realistic. The novelty of this study is the development of a system that recognizes human-animal affective robot touch (HAART) using a deep learning algorithm. The proposed system performs touch gesture recognition on a dataset provided by the Recognition of the Touch Gestures Challenge 2015. The dataset comprises numerous subjects performing different HAART gestures; each touch was performed on a robotic animal covered by a pressure-sensor skin. A convolutional neural network algorithm is proposed to implement the touch recognition system from the raw inputs of the sensor devices. The leave-one-subject-out cross-validation method was used to validate and evaluate the proposed system, and a comparative analysis between its results and the state-of-the-art performance is presented. Findings show that the proposed system can recognize gestures in almost real time (after acquiring the minimum number of frames). Under leave-one-subject-out cross-validation, the proposed algorithm achieved a classification accuracy of 83.2%. It also outperformed existing systems on the same dataset in terms of classification ratio, touch recognition time, and data preprocessing. The proposed approach can therefore be applied to a wide range of real-world tasks, such as image recognition, natural language recognition, and video clip classification.
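The leave-one-subject-out protocol mentioned in the abstract can be sketched as follows. This is an illustrative reimplementation rather than the authors' code: each fold holds out all samples from one subject for testing and trains on the rest, so accuracy reflects generalization to unseen subjects. The subject IDs below are hypothetical.

```python
import numpy as np

def leave_one_subject_out_splits(subject_ids):
    """Yield (train_idx, test_idx) index arrays, one fold per unique subject.

    The test fold contains every sample from the held-out subject; the
    train fold contains all samples from the remaining subjects.
    """
    subject_ids = np.asarray(subject_ids)
    for subject in np.unique(subject_ids):
        test_mask = subject_ids == subject
        yield np.where(~test_mask)[0], np.where(test_mask)[0]

# Hypothetical example: 6 gesture samples recorded from 3 subjects.
ids = ["s1", "s1", "s2", "s2", "s3", "s3"]
folds = list(leave_one_subject_out_splits(ids))
# Produces 3 folds; e.g. the first fold tests on s1's samples (indices 0, 1)
# and trains on the remaining four samples.
```

Per-fold accuracies are then averaged to obtain a single cross-validated score, which is how a figure such as the reported 83.2% would typically be computed.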
Pages: 25-37
Page count: 13