Unsupervised Domain Adaptation for WiFi Gesture Recognition

Cited by: 3
Authors
Zhang, Bin-Bin [1]
Zhang, Dongheng [1,2,3]
Hu, Yang [1]
Chen, Yan [1,2,3]
Affiliations
[1] Univ Sci & Technol China, Sch Cyber Sci & Technol, Hefei, Peoples R China
[2] Minist Culture & Tourism, Key Lab Cyberspace Cultural Content Cognit Commun, Hefei, Peoples R China
[3] Hefei Comprehens Natl Sci Ctr, Inst Dataspace, Hefei, Peoples R China
Source
2023 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC | 2023
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Gesture Recognition; Cross Domain; Unsupervised Domain Adaptation; WiFi Sensing;
DOI
10.1109/WCNC55385.2023.10118941
CLC Classification Number
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Human gesture recognition with WiFi signals has attracted considerable attention due to the ubiquity, privacy-preserving nature, and broad coverage of WiFi signals. These gesture recognition systems rely on neural networks trained with a large amount of labeled data. However, a recognition model trained with data collected under certain conditions suffers significant performance degradation when applied in practical deployments, which limits the application of gesture recognition systems. In this paper, we propose UDAWiGR, an unsupervised domain adaptation framework for WiFi-based gesture recognition that aims to enhance the performance of the recognition model under new conditions by making effective use of unlabeled data from those conditions. We first propose a pseudo-labeling method with a confidence control constraint to utilize unlabeled data for model training. We then utilize consistency regularization to align the output distributions and enhance the robustness of the neural network under signal perturbations. Furthermore, we propose a cross-match loss that combines pseudo-labeling and consistency regularization, making the whole framework simple yet effective. Extensive experiments demonstrate that the proposed framework achieves a 4.35% accuracy improvement compared with state-of-the-art methods on a public dataset.
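The abstract outlines three ingredients: pseudo-labeling under a confidence control constraint, consistency regularization across perturbed views, and a cross-match loss that couples the two. As a rough, hypothetical illustration only (not the authors' implementation), the PyTorch-style sketch below shows how such a combined unlabeled-data objective is commonly structured in FixMatch-style semi-supervised learning; the names model, weak_aug, strong_aug, and the threshold tau are assumptions introduced for illustration.

import torch
import torch.nn.functional as F

def cross_match_loss(model, unlabeled_csi, weak_aug, strong_aug, tau=0.95):
    # Weakly perturbed view generates pseudo-labels; no gradients flow here.
    with torch.no_grad():
        probs_w = F.softmax(model(weak_aug(unlabeled_csi)), dim=-1)
        conf, pseudo = probs_w.max(dim=-1)
        mask = (conf >= tau).float()  # confidence control constraint

    # Strongly perturbed view must match the pseudo-label (consistency regularization).
    logits_s = model(strong_aug(unlabeled_csi))
    per_sample = F.cross_entropy(logits_s, pseudo, reduction="none")

    # Only confident samples contribute, fusing both terms into a single loss.
    return (mask * per_sample).mean()

In practice, such an unlabeled-data loss would be added, with a weighting factor, to the standard cross-entropy loss on labeled source-domain data.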
Pages: 6