When Video meets Inertial Sensors: Zero-shot Domain Adaptation for Finger Motion Analytics with Inertial Sensors

Cited by: 29
Authors
Liu, Yilin [1 ]
Zhang, Shijia [1 ]
Gowda, Mahanth [1 ]
Affiliations
[1] Pennsylvania State University, University Park, PA 16802, USA
Source
PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INTERNET-OF-THINGS DESIGN AND IMPLEMENTATION, IOTDI 2021 | 2021
Keywords
IoT; Wearable; IMU; Data augmentation; Finger gesture
DOI
10.1145/3450268.3453537
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Ubiquitous finger motion tracking enables a number of exciting applications in augmented reality, sports analytics, rehabilitation healthcare, etc. While finger motion tracking with cameras is very mature, largely due to the availability of massive training datasets, there is a dearth of training data for developing robust machine learning (ML) models for wearable IoT devices with Inertial Measurement Unit (IMU) sensors. Towards addressing this problem, this paper presents ZeroNet, a system that shows the feasibility of developing ML models for IMU sensors with zero training overhead. ZeroNet harvests training data from publicly available videos for performing inferences on IMU data. The differences between the video and IMU domains introduce a number of challenges arising from differences in sensor-camera coordinate systems, body sizes of users, speed/orientation changes during gesturing, sensor position variations, etc. ZeroNet addresses these challenges by systematically extracting motion data from videos and transforming it into the acceleration and orientation information measured by IMU sensors. Furthermore, data-augmentation techniques are exploited to create synthetic variations in the harvested training data, enhancing the generalizability and robustness of the ML models to user diversity. Evaluation with 10 users demonstrates a top-1 accuracy of 82.4% and a top-3 accuracy of 94.8% for recognition of 50 finger gestures, thus indicating promise. While we have only scratched the surface, we outline a number of interesting possibilities for extending this work in the cross-disciplinary areas of computer vision, machine learning, and wearable IoT for enabling novel applications in finger motion tracking.
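The core transformation the abstract describes, i.e. deriving IMU-style acceleration from video-tracked fingertip positions and then synthesizing orientation variants for data augmentation, can be sketched as follows. This is a minimal 2D illustration under simplifying assumptions, not ZeroNet's actual pipeline; the function names, the central-difference scheme, and the rotation-based augmentation are choices of this sketch.

```python
import math

def accel_from_trajectory(xs, ys, dt):
    """Approximate acceleration from video-tracked fingertip
    positions via central second differences: a_i ~ (p[i-1] -
    2*p[i] + p[i+1]) / dt^2. Returns per-axis acceleration for
    the interior samples (hypothetical helper for illustration)."""
    ax = [(xs[i - 1] - 2 * xs[i] + xs[i + 1]) / dt ** 2
          for i in range(1, len(xs) - 1)]
    ay = [(ys[i - 1] - 2 * ys[i] + ys[i + 1]) / dt ** 2
          for i in range(1, len(ys) - 1)]
    return ax, ay

def augment_rotate(ax, ay, theta):
    """Create a synthetic orientation variant by rotating every
    acceleration vector by theta radians, mimicking a differently
    oriented wrist/finger sensor."""
    c, s = math.cos(theta), math.sin(theta)
    rx = [c * a - s * b for a, b in zip(ax, ay)]
    ry = [s * a + c * b for a, b in zip(ax, ay)]
    return rx, ry
```

In practice a real pipeline would also low-pass filter the noisy video keypoints before differentiating, scale trajectories to account for body-size differences, and map from the camera frame into the sensor frame; the rotation above stands in for the orientation part of that mapping.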
Pages: 182-194
Page count: 13