MASS: A Multisource Domain Adaptation Network for Cross-Subject Touch Gesture Recognition

Cited by: 6
Authors
Li, Yun-Kai [1 ]
Meng, Qing-Hao [1 ]
Wang, Ya-Xin [1 ]
Yang, Tian-Hao [1 ]
Hou, Hui-Rang [1 ]
Affiliations
[1] Tianjin Univ, Inst Robot & Autonomous Syst, Sch Elect & Informat Engn, Tianjin Key Lab Proc Measurement & Control, Tianjin 300072, Peoples R China
Funding
China Postdoctoral Science Foundation
Keywords
Feature extraction; Silicon; Three-dimensional displays; Task analysis; Kernel; Training; Testing; Cross-subject; human-robot tactile interaction; multisource domain adaptation; touch gesture recognition (TGR)
DOI
10.1109/TII.2022.3174063
Chinese Library Classification
TP [Automation & Computer Technology]
Discipline Code
0812
Abstract
Touch gesture recognition (TGR) plays a pivotal role in many applications, such as socially assistive robots and embodied telecommunication. However, one obstacle to the practicality of existing TGR methods is individual disparity across subjects. Moreover, a deep neural network trained on multiple existing subjects easily overfits and generalizes poorly to a new subject. Hence, mitigating the discrepancies between new and existing subjects and building a generalized network for TGR is a significant step toward reliable human-robot tactile interaction. In this article, a novel framework for Multisource domain Adaptation via Shared-Specific feature projection (MASS) is proposed, which incorporates intradomain discriminant, multidomain discriminant, and cross-domain consistency constraints into a deep learning network for cross-subject TGR. Specifically, the MASS method first extracts shared features in the common feature space of the training subjects, with which a domain-general classifier is built. Then, the specific features of each pair of training and testing subjects are mapped and aligned in their common feature space, and multiple domain-specific classifiers are trained on these specific features. Finally, the domain-general and domain-specific classifiers are ensembled to predict the labels of a new subject's touch samples. Experiments on two datasets show that the proposed MASS method achieves remarkable results for cross-subject TGR. The code of MASS is available at https://github.com/AI-touch/MASS.
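The final stage described in the abstract, ensembling a domain-general classifier with several domain-specific classifiers, can be sketched as follows. This is a minimal illustration only: the function names and the uniform averaging of class posteriors are assumptions for clarity, not the paper's exact fusion rule (see the linked repository for the authors' implementation).

```python
import numpy as np

def softmax(logits, axis=-1):
    """Numerically stable softmax over the class axis."""
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def ensemble_predict(general_logits, specific_logits_list):
    """Fuse one domain-general head with K domain-specific heads.

    general_logits: (batch, classes) logits from the shared-feature classifier.
    specific_logits_list: list of (batch, classes) logits, one per source domain.
    Returns the predicted class index per sample (uniform posterior average,
    an illustrative assumption).
    """
    probs = softmax(general_logits)
    for logits in specific_logits_list:
        probs = probs + softmax(logits)
    probs /= 1 + len(specific_logits_list)
    return probs.argmax(axis=-1)
```

For example, with one test sample, a general head favoring class 0 and two specific heads that mostly agree, the averaged posterior still selects class 0.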
Pages: 3099-3108 (10 pages)