Optimal Sensor Placement and Multimodal Fusion for Human Activity Recognition in Agricultural Tasks

Cited by: 2
Authors
Benos, Lefteris [1 ]
Tsaopoulos, Dimitrios [1 ]
Tagarakis, Aristotelis C. [1 ]
Kateris, Dimitrios [1 ]
Bochtis, Dionysis [1 ,2 ]
Affiliations
[1] Ctr Res & Technol Hellas CERTH, Inst Bioecon & Agritechnol IBO, GR-57001 Thessaloniki, Greece
[2] FarmB Digital Agr, Doiraniis 17, Thessaloniki GR-54639, Greece
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, No. 18
Keywords
Long Short-Term Memory (LSTM) networks; wearable sensors; multi-sensor information fusion; human-robot collaboration; human factors; cost-optimal system configuration; ROBOTS; SPINE;
DOI
10.3390/app14188520
Chinese Library Classification: O6 [Chemistry]
Subject classification code: 0703
Abstract
This study examines the impact of sensor placement and multimodal sensor fusion on the performance of a Long Short-Term Memory (LSTM)-based model for classifying human activities in an agricultural harvesting scenario involving human-robot collaboration. Data were collected from twenty participants performing six distinct activities while wearing five inertial measurement units placed at various anatomical locations. The sensor signals were first processed to eliminate noise and then fed into an LSTM neural network, which recognizes patterns in sequential time-dependent data. Results indicated that the chest-mounted sensor provided the highest F1-score of 0.939, outperforming all other placements and combinations of them. Moreover, the magnetometer surpassed the accelerometer and gyroscope, highlighting its superior ability to capture the orientation and motion information relevant to the investigated activities. Nevertheless, multimodal fusion of accelerometer, gyroscope, and magnetometer data demonstrated the benefit of integrating different sensor types to improve classification accuracy. The study emphasizes the effectiveness of strategic sensor placement and fusion in optimizing human activity recognition, thereby minimizing data requirements and computational expense and resulting in a cost-optimal system configuration. Overall, this research contributes to the development of more intelligent, safe, and cost-effective adaptive synergistic systems that can be integrated into a variety of applications.
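The pipeline described above (denoised, windowed IMU signals fed to an LSTM classifier over six activity classes) can be outlined with a minimal sketch. This is not the authors' code: the 128-sample window length, the single 64-unit LSTM layer, the dropout rate, and the use of TensorFlow/Keras are assumptions made purely for illustration; only the nine input channels (accelerometer, gyroscope, and magnetometer, three axes each, for one body location) and the six output classes follow the study's description.

# Minimal sketch (assumed hyperparameters, not the authors' implementation):
# an LSTM classifier for fixed-length windows of 9-channel IMU data.
import numpy as np
import tensorflow as tf

WINDOW_LEN = 128   # samples per window (assumed)
N_CHANNELS = 9     # 3 sensor modalities x 3 axes for one placement
N_CLASSES = 6      # six harvesting-related activities

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
    tf.keras.layers.LSTM(64),        # encodes the time-dependent window
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy arrays standing in for the denoised, windowed sensor streams
# and their integer activity labels.
X = np.random.randn(32, WINDOW_LEN, N_CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=32)
model.fit(X, y, epochs=1, batch_size=8, verbose=0)

Restricting N_CHANNELS to a single modality (e.g., the three magnetometer axes) or to a single placement is how the single-sensor and single-modality comparisons reported in the abstract could be reproduced under these assumptions.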
Pages: 16