mmGesture: Semi-supervised gesture recognition system using mmWave radar

Cited: 26
Authors
Yan, Baiju [1]
Wang, Peng [2]
Du, Lidong [2]
Chen, Xianxiang [2]
Fang, Zhen [2,3,4]
Wu, Yirong [2,3]
Affiliations
[1] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai, Peoples R China
[2] Chinese Acad Sci AIRCAS, Aerosp Informat Res Inst, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Sch Elect Elect & Commun Engn, Beijing, Peoples R China
[4] Chinese Acad Med Sci, Personalized Management Chron Resp Dis, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Gesture recognition; mmWave radar; Semi-supervised learning; Π-model
DOI
10.1016/j.eswa.2022.119042
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Gesture recognition has found versatile applications in natural human-computer interaction (HCI). Compared with traditional camera-based or wearable-sensor-based solutions, gesture recognition using millimeter wave (mmWave) radar has attracted growing attention because it is contact-free, privacy-preserving and less environment-dependent. Most recent studies adopt one of four representations extracted from the raw radar signal, the Range Doppler Image (RDI), Range Angle Image (RAI), Doppler Angle Image (DAI) or Micro-Doppler Spectrogram, as the input of a deep neural network for gesture recognition. However, the relative effectiveness of these four inputs for gesture recognition has received little attention so far. Moreover, the lack of large amounts of labeled data restricts the performance of traditional supervised learning networks. In this paper, we first conduct extensive experiments to compare the effectiveness of each of these four inputs for gesture recognition. We then propose a semi-supervised learning framework that utilizes a small amount of labeled data in the source domain and large amounts of unlabeled data in the target domain. Specifically, we combine the Π-model with data augmentation tricks tailored to the mmWave signal to realize domain-independent gesture recognition. Extensive experiments on a public mmWave gesture dataset demonstrate the superior effectiveness of the proposed system.
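The Π-model objective that the abstract builds on combines a supervised cross-entropy loss on the few labeled samples with a consistency penalty between two stochastically augmented forward passes of every sample, labeled or not. The sketch below is illustrative only: the Gaussian-noise augmentation and the linear "model" are placeholders, not the paper's radar-specific augmentations or network architecture.

```python
import numpy as np

def augment(x, rng):
    """Hypothetical augmentation: additive Gaussian noise (the paper's actual
    radar-specific augmentation tricks are not reproduced here)."""
    return x + 0.1 * rng.normal(size=x.shape)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def pi_model_loss(model, x, y, labeled_mask, rng, w_unsup=1.0):
    # Two stochastic forward passes over independently augmented copies.
    p1 = softmax(model(augment(x, rng)))
    p2 = softmax(model(augment(x, rng)))
    # Consistency (MSE) term over ALL samples, labeled and unlabeled alike.
    unsup = np.mean((p1 - p2) ** 2)
    # Cross-entropy only where labels exist (the small labeled source set).
    eps = 1e-12
    sup = -np.mean(np.log(p1[labeled_mask, y[labeled_mask]] + eps))
    return sup + w_unsup * unsup

# Toy demo: 16 "radar feature" vectors with 4 gesture classes, only 4 labeled.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))          # stand-in linear "network"
model = lambda feats: feats @ W
x = rng.normal(size=(16, 8))
y = rng.integers(0, 4, size=16)
labeled_mask = np.zeros(16, dtype=bool)
labeled_mask[:4] = True
loss = pi_model_loss(model, x, y, labeled_mask, rng)
```

In practice the unsupervised weight `w_unsup` is typically ramped up over the first training epochs so that the consistency term does not dominate before the classifier has learned anything from the labeled samples.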
Pages: 12