Human Activity Recognition Based on Point Clouds from Millimeter-Wave Radar

Cited by: 1
Authors
Lim, Seungchan [1 ]
Park, Chaewoon [1 ]
Lee, Seongjoo [2 ,3 ]
Jung, Yunho [1 ,4 ]
Affiliations
[1] Korea Aerosp Univ, Sch Elect & Informat Engn, Goyang 10540, South Korea
[2] Sejong Univ, Dept Elect Engn, Seoul 05006, South Korea
[3] Sejong Univ, Dept Convergence Engn Intelligent Drone, Seoul 05006, South Korea
[4] Korea Aerosp Univ, Dept Smart Air Mobil, Goyang 10540, South Korea
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 22
Keywords
millimeter-wave radar; 3D point cloud; human activity recognition; field-programmable gate array;
DOI
10.3390/app142210764
CLC Classification
O6 [Chemistry];
Subject Classification
0703 ;
Abstract
Human activity recognition (HAR) technology is closely tied to human safety and convenience, so it must infer human activity accurately. It must also consume little power while continuously detecting activity and be inexpensive to operate, which makes a low-power, lightweight HAR system design essential. In this paper, we propose a low-power and lightweight HAR system that uses point-cloud data collected by radar. The proposed HAR system employs a pillar feature encoder, which converts 3D point-cloud data into a 2D image, and a classification network based on depth-wise separable convolution for a lightweight design. The proposed classification network achieved an accuracy of 95.54% with 25.77 M multiply-accumulate operations and 22.28 K network parameters in a 32-bit floating-point format. With 4-bit quantization, the network achieved 94.79% accuracy while reducing memory usage to 12.5% of that of the 32-bit format. In addition, we implemented the lightweight HAR system, optimized for low-power operation, on a heterogeneous computing platform, a Zynq UltraScale+ ZCU104 device, through hardware-software co-implementation. The system took 2.43 ms of execution time to perform one frame of HAR on the device and consumed 3.479 W of power while running.
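The lightweighting claims in the abstract follow from simple arithmetic. A minimal sketch (not the authors' code; the layer sizes below are illustrative assumptions, not the paper's actual network dimensions) of why depth-wise separable convolution shrinks a network, and of the 12.5% memory figure for 4-bit quantization:

```python
# Sketch: parameter count of a standard k x k convolution vs. a depth-wise
# separable one (depth-wise k x k conv followed by point-wise 1 x 1 conv),
# the lightweighting technique named in the abstract. Channel/kernel sizes
# are illustrative assumptions only.

def conv_params(c_in, c_out, k):
    """Parameters of a standard k x k convolution (no bias)."""
    return c_in * c_out * k * k

def dws_conv_params(c_in, c_out, k):
    """Depth-wise k x k conv (c_in * k * k) + point-wise 1 x 1 conv (c_in * c_out)."""
    return c_in * k * k + c_in * c_out

c_in, c_out, k = 64, 128, 3                      # assumed layer sizes
std = conv_params(c_in, c_out, k)                # 73,728 parameters
dws = dws_conv_params(c_in, c_out, k)            # 8,768 parameters
print(f"standard: {std}, separable: {dws}, ratio: {dws / std:.3f}")

# 4-bit quantization stores each weight in 4 bits instead of 32,
# i.e. 4/32 = 12.5% of the original weight memory, matching the abstract.
print(f"quantized memory fraction: {4 / 32:.3f}")
```

For these assumed sizes the separable layer uses roughly 8x fewer parameters than the standard convolution, and the weight-memory fraction under 4-bit quantization is exactly the 12.5% reported in the abstract.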
Pages: 13