MASS: A Multisource Domain Adaptation Network for Cross-Subject Touch Gesture Recognition

Cited: 6
Authors
Li, Yun-Kai [1 ]
Meng, Qing-Hao [1 ]
Wang, Ya-Xin [1 ]
Yang, Tian-Hao [1 ]
Hou, Hui-Rang [1 ]
Affiliations
[1] Tianjin Univ, Inst Robot & Autonomous Syst, Sch Elect & Informat Engn, Tianjin Key Lab Proc Measurement & Control, Tianjin 300072, Peoples R China
Funding
China Postdoctoral Science Foundation
Keywords
Feature extraction; Silicon; Three-dimensional displays; Task analysis; Kernel; Training; Testing; Cross-subject; human-robot tactile interaction; multisource domain adaptation; touch gesture recognition (TGR)
DOI
10.1109/TII.2022.3174063
CLC number
TP [automation and computer technology]
Subject classification code
0812
Abstract
Touch gesture recognition (TGR) plays a pivotal role in many applications, such as socially assistive robots and embodied telecommunication. However, one obstacle to the practicality of existing TGR methods is the individual disparity across subjects. Moreover, a deep neural network trained on multiple existing subjects easily overfits when applied to a new subject. Hence, mitigating the discrepancies between new and existing subjects and building a generalized network for TGR is essential for reliable human-robot tactile interaction. In this article, a novel framework for Multisource domain Adaptation via Shared-Specific feature projection (MASS) is proposed, which incorporates intradomain discriminant, multidomain discriminant, and cross-domain consistency objectives into a deep learning network for cross-subject TGR. Specifically, the MASS method first extracts shared features in the common feature space of the training subjects, from which a domain-general classifier is built. Then, the specific features of each pair of training and testing subjects are mapped and aligned in their common feature space, and multiple domain-specific classifiers are trained on these specific features. Finally, the domain-general and domain-specific classifiers are ensembled to predict labels for the touch samples of a new subject. Experimental results on two datasets show that the proposed MASS method achieves remarkable performance for cross-subject TGR. The code of MASS is available at https://github.com/AI-touch/MASS.
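The abstract's final step (ensembling one domain-general classifier with several domain-specific classifiers to label a new subject's samples) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the random linear classifiers, the pre-projected shared/specific features, and the uniform-weight averaging rule are all placeholders — the actual MASS code is at the repository linked above.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

n_classes, d_feat, n_sources = 4, 8, 3

# Hypothetical trained parameters: one domain-general classifier on the
# shared features, plus one domain-specific classifier per source subject.
W_general = rng.normal(size=(d_feat, n_classes))
W_specific = [rng.normal(size=(d_feat, n_classes)) for _ in range(n_sources)]

def ensemble_predict(x_shared, x_specific_list):
    """Average the class probabilities of the domain-general classifier
    and all domain-specific classifiers, then take the argmax.
    (A uniform-weight ensemble; MASS may weight classifiers differently.)"""
    probs = softmax(x_shared @ W_general)
    for x_spec, W in zip(x_specific_list, W_specific):
        probs = probs + softmax(x_spec @ W)
    probs = probs / (1 + len(W_specific))
    return probs, probs.argmax(axis=-1)

# A batch of two touch samples from a new (test) subject, already projected
# into the shared space and into each pairwise-specific space.
x_shared = rng.normal(size=(2, d_feat))
x_specific = [rng.normal(size=(2, d_feat)) for _ in range(n_sources)]
probs, labels = ensemble_predict(x_shared, x_specific)
print(probs.shape, labels.shape)  # (2, 4) (2,)
```

Averaging probabilities (rather than voting on hard labels) lets a confident domain-specific classifier outweigh uncertain ones, which is the usual motivation for soft ensembling in multisource adaptation.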
Pages: 3099-3108
Page count: 10