ENHANCING HUMAN ACTIVITY RECOGNITION THROUGH SENSOR FUSION AND HYBRID DEEP LEARNING MODEL

Cited by: 4
Authors
Tarekegn, Adane Nega [1]
Ullah, Mohib [1]
Cheikh, Faouzi Alaya [1]
Sajjad, Muhammad [1]
Affiliations
[1] Norwegian Univ Sci & Technol NTNU, Software Data & Digital Environm SDDE Res Grp, Dept Comp Sci, Gjovik, Norway
Source
2023 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW | 2023
Keywords
sensor fusion; human activity recognition; deep learning; smart belt; wearable sensor
DOI
10.1109/ICASSPW59220.2023.10193698
CLC Number
O42 [Acoustics]
Discipline Codes
070206; 082403
Abstract
Wearable-based human activity recognition (HAR) is essential for several applications, such as health monitoring, physical training, and rehabilitation. However, most HAR systems currently depend on a single sensor, typically a smartphone, owing to its widespread use. To improve performance and adapt to various scenarios, this study focuses on a smart belt equipped with accelerometer and gyroscope sensors for detecting activities of daily living (ADLs). The collected data were pre-processed, fused, and used to train a hybrid deep learning model combining a CNN and a BiLSTM network. We evaluated the effect of window length on recognition accuracy and conducted a performance analysis of the proposed model. Our framework achieved an overall accuracy of 96% with a window length of 5 seconds, demonstrating its effectiveness in recognizing ADLs. The results show that belt-based sensor fusion for HAR provides valuable insights into human behaviour and could enhance applications such as healthcare, fitness, and sports training.
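The abstract's central preprocessing step is segmenting the fused accelerometer and gyroscope streams into fixed-length windows before they are fed to the CNN-BiLSTM. A minimal sketch of that windowing step is shown below; the 50 Hz sampling rate and 50% overlap are illustrative assumptions, not values stated in the abstract (only the best-performing 5-second window length is).

```python
import numpy as np

def sliding_windows(data, fs, window_s, overlap=0.5):
    """Segment a (T, C) multichannel signal into fixed-length windows.

    data:     array of shape (T, C), e.g. fused 3-axis accel + 3-axis gyro
    fs:       sampling rate in Hz (assumed here; not given in the abstract)
    window_s: window length in seconds (the paper reports 96% accuracy at 5 s)
    overlap:  fraction of overlap between consecutive windows (assumed)
    """
    win = int(window_s * fs)                  # samples per window
    step = max(1, int(win * (1 - overlap)))   # hop between window starts
    n = (len(data) - win) // step + 1         # number of full windows
    return np.stack([data[i * step : i * step + win] for i in range(n)])

# Example: 60 s of 6-channel belt data at an assumed 50 Hz
fs = 50
stream = np.random.randn(60 * fs, 6)          # fused accelerometer + gyroscope
windows = sliding_windows(stream, fs, window_s=5)
print(windows.shape)                          # (23, 250, 6)
```

Each resulting window (here 250 samples x 6 channels) is one training example for the hybrid model; sweeping `window_s` reproduces the kind of window-length analysis the authors describe.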
Pages: 5