Diver-robot communication dataset for underwater hand gesture recognition

Cited by: 1
Authors
Kvasic, Igor [1 ]
Antillon, Derek Orbaugh [2 ]
Nad, Dula [1 ]
Walker, Christopher [2 ]
Anderson, Iain [2 ]
Miskovic, Nikola [1 ]
Affiliations
[1] Univ Zagreb, Fac Elect Engn & Comp, Lab Underwater Syst & Technol, Miramarska 20, Zagreb 10000, Croatia
[2] Univ Auckland, Auckland Bioengn Inst, Biomimet Lab, 6-70 Symonds St, Auckland 1010, New Zealand
Keywords
Dataset; Diving gestures; Gesture recognition; Gesture recognizing glove; Underwater imaging; Image processing; Marine robotics; Image classification; Human-robot interaction; Underwater human-robot interaction
DOI
10.1016/j.comnet.2024.110392
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Subject Classification Code
0812
Abstract
In this paper, we present a dataset of diving gesture images used for human-robot interaction underwater. By offering this open-access dataset, the paper aims to investigate the potential of using visual detection of diving gestures from an autonomous underwater vehicle (AUV) as a form of communication with a human diver. In addition to the image recordings, the same gestures were recorded using a smart gesture recognition glove. The glove uses dielectric elastomer sensors and on-board processing to determine the selected gesture and transmit the command associated with the gesture to the AUV via acoustics. Although this method can be used under different visibility conditions and even without line of sight, it introduces a communication delay required for the acoustic transmission of the gesture command. To compare efficiency, the glove was equipped with visual markers proposed in a gesture-based language called CADDIAN and recorded with an underwater camera in parallel with the glove's on-board recognition process. The dataset contains over 30,000 underwater frames of nearly 900 individual gestures, annotated in corresponding snippet folders. The dataset was recorded in a balanced ratio, with five different divers in sea conditions and five different divers in pool conditions, and with gestures recorded at 1, 2 and 3 metres from the camera. The glove gesture recognition statistics are reported in terms of average diver reaction time, average time taken to perform a gesture, recognition success rate, transmission times and more. The presented dataset should provide a good baseline for comparing the performance of state-of-the-art visual diving gesture recognition techniques under different visibility conditions.
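The abstract describes the data as per-gesture snippet folders containing annotated underwater frames. As a minimal, hedged sketch only (not the dataset's documented interface), the Python snippet below shows how such a layout could be indexed for experiments; the root folder name diver_gesture_dataset, the assumption that the gesture annotation is encoded in the folder name, and the image file extensions are placeholders that would need to be adapted to the folder structure actually shipped with the data.

```python
from pathlib import Path


def index_snippets(root: str) -> list[dict]:
    """Index per-gesture snippet folders under `root` (hypothetical layout)."""
    root_path = Path(root)
    records = []
    if not root_path.is_dir():
        return records  # nothing to index; the dataset root was not found
    for snippet_dir in sorted(p for p in root_path.iterdir() if p.is_dir()):
        # Assumption: each snippet folder holds the extracted frames of one
        # gesture, and its name carries the annotation for that gesture.
        frames = sorted(snippet_dir.glob("*.jpg")) + sorted(snippet_dir.glob("*.png"))
        if not frames:
            continue
        records.append({
            "snippet": snippet_dir.name,      # annotation assumed to be in the folder name
            "num_frames": len(frames),
            "frames": [str(p) for p in frames],
        })
    return records


if __name__ == "__main__":
    # "diver_gesture_dataset" is a placeholder path, not the dataset's real name.
    index = index_snippets("diver_gesture_dataset")
    total_frames = sum(r["num_frames"] for r in index)
    print(f"{len(index)} gesture snippets, {total_frames} frames indexed")
```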
Pages: 11
Related Papers
50 records in total
  • [31] FGDSNet: A Lightweight Hand Gesture Recognition Network for Human Robot Interaction
    Zhou, Guoyu
    Cui, Zhenchao
    Qi, Jing
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9(4): 3076-3083
  • [32] Real-Time Hand Gesture Recognition for Human Robot Interaction
    Correa, Mauricio
    Ruiz-del-Solar, Javier
    Verschae, Rodrigo
    Lee-Ferng, Jong
    Castillo, Nelson
    ROBOCUP 2009: ROBOT SOCCER WORLD CUP XIII, 2010, 5949: 46-57
  • [33] A NEW DATASET FOR HAND GESTURE ESTIMATION
    Shao, Biyao
    Xie, Yifeng
    Yang, Hongnan
    Jiang, Yatong
    Yan, Chenggang
    Xie, Hongtao
    Wang, Yangang
    2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017: 1388-1392
  • [34] Underwater Gesture Recognition Meta-Gloves for Marine Immersive Communication
    Liu, Jiaxu
    Wang, Lihong
    Xu, Ruidong
    Zhang, Xinwei
    Zhao, Jisheng
    Liu, Hong
    Chen, Fuxing
    Qu, Lijun
    Tian, Mingwei
    ACS NANO, 2024, 18(16): 10818-10828
  • [35] Gesture MNIST: A New Free-Hand Gesture Dataset
    Schak, Monika
    Gepperth, Alexander
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532: 657-668
  • [36] An Efficient Solution for Hand Gesture Recognition from Video Sequence
    Prodan, Remus-Catalin
    Pentiuc, Stefan-Gheorghe
    Vatavu, Radu-Daniel
    ADVANCES IN ELECTRICAL AND COMPUTER ENGINEERING, 2012, 12(3): 85-88
  • [37] HGM-4: A new multi-cameras dataset for hand gesture recognition
    Vinh Truong Hoang
    DATA IN BRIEF, 2020, 30
  • [38] Hand posture recognition in gesture-based human-robot interaction
    Yin, Xiaoming
    Zhu, Xing
    ICIEA 2006: 1ST IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS, VOLS 1-3, PROCEEDINGS, 2006: 397-402
  • [39] A Method for Hand Gesture Recognition
    Shukla, Jaya
    Dwivedi, Ashutosh
    2014 FOURTH INTERNATIONAL CONFERENCE ON COMMUNICATION SYSTEMS AND NETWORK TECHNOLOGIES (CSNT), 2014: 919-923
  • [40] Hand posture recognition in gesture-based human-robot interaction
    Yin, Xiaoming
    Zhu, Xing
    2006 1ST IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS, VOLS 1-3, 2006: 835-+