Multi-Sensor Data Fusion and CNN-LSTM Model for Human Activity Recognition System

Citations: 9
Authors
Zhou, Haiyang [1]
Zhao, Yixin [1]
Liu, Yanzhong [1]
Lu, Sichao [1]
An, Xiang [1,2]
Liu, Qiang [1]
Affiliations
[1] Beijing Inst Petrochem Technol, Acad Artificial Intelligence, Beijing 102617, Peoples R China
[2] Beijing Acad Safety Engn & Technol, Beijing 102617, Peoples R China
Keywords
human activity recognition; multi-sensor data fusion; fusion algorithm; CNN-LSTM; radar
DOI
10.3390/s23104750
Chinese Library Classification (CLC): O65 [Analytical Chemistry]
Discipline Codes: 070302; 081704
Abstract
Human activity recognition (HAR) is becoming increasingly important, especially with the growing number of elderly people living at home. However, most sensors, such as cameras, do not perform well in low-light environments. To address this issue, we designed a HAR system that combines a camera and a millimeter-wave radar, leveraging the strengths of each sensor together with a fusion algorithm to distinguish between easily confused human activities and to improve accuracy in low-light settings. To extract the spatial and temporal features contained in the multi-sensor fusion data, we designed an improved CNN-LSTM model. In addition, three data fusion algorithms were investigated. Compared to camera data alone in low-light environments, the fusion data significantly improved HAR accuracy by at least 26.68%, 19.87%, and 21.92% under the data-level, feature-level, and decision-level fusion algorithms, respectively. Moreover, the data-level fusion algorithm also reduced the best misclassification rate to 2% to 6%. These findings suggest that the proposed system has the potential to enhance the accuracy of HAR in low-light environments and to decrease human activity misclassification rates.
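To illustrate one of the three fusion levels mentioned in the abstract, the sketch below shows a minimal decision-level fusion scheme: each sensor's classifier produces per-class probabilities, and the system combines them by weighted averaging before picking the top class. This is a generic illustration of decision-level fusion, not the paper's actual algorithm; the weights, activity labels, and probability values are all illustrative assumptions.

```python
# Minimal sketch of decision-level sensor fusion (hypothetical, not the
# paper's implementation). Each sensor's classifier emits per-class
# probabilities; the fused score is a weighted average of the two.

def fuse_decisions(camera_probs, radar_probs, camera_weight=0.5):
    """Weighted average of two per-class probability lists."""
    radar_weight = 1.0 - camera_weight
    return [camera_weight * c + radar_weight * r
            for c, r in zip(camera_probs, radar_probs)]

def classify(probs, labels):
    """Return the label with the highest fused probability."""
    return max(zip(probs, labels))[1]

# Illustrative example: in low light the camera is uncertain, while the
# radar (unaffected by lighting) is confident, so the radar is weighted up.
labels = ["walking", "sitting", "falling"]  # assumed activity classes
camera = [0.2, 0.5, 0.3]                    # camera unsure in low light
radar = [0.1, 0.1, 0.8]                     # radar confident
fused = fuse_decisions(camera, radar, camera_weight=0.4)
print(classify(fused, labels))
```

Data-level fusion would instead merge the raw sensor streams before any model, and feature-level fusion would concatenate learned features (e.g. inside the CNN-LSTM) before classification; decision-level fusion is the latest-stage and simplest of the three to combine.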
Pages: 28