Towards a Dynamic Inter-Sensor Correlations Learning Framework for Multi-Sensor-Based Wearable Human Activity Recognition

Cited by: 21
Authors
Miao, Shenghuan [1 ]
Chen, Ling [2 ]
Hu, Rong [1 ]
Luo, Yingsong [3 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, 38 Zheda Rd, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Alibaba Zhejiang Univ, Coll Comp Sci & Technol, Joint Res Inst Frontier Technol, 38 Zheda Rd, Hangzhou 310027, Peoples R China
[3] Zhejiang Univ, Sch Software Technol, 38 Zheda Rd, Hangzhou 310027, Peoples R China
Source
PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT | 2022, Vol. 6, No. 03
Keywords
human activity recognition; wearable sensors; graph convolution network; information fusion;
DOI
10.1145/3550331
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Multi-sensor-based wearable human activity recognition (WHAR) is a research hotspot in ubiquitous computing. Extracting effective features from multi-sensor data is essential to improving activity recognition performance. Despite the notable achievements of previous works, modelling the dynamic correlations between sensors remains a challenge. In this paper, we propose DynamicWHAR, a lightweight yet efficient GCN-based framework for automatically learning the dynamic correlations between sensors. DynamicWHAR consists mainly of two modules: Initial Feature Extraction and Dynamic Information Interaction. First, the Initial Feature Extraction module performs a data-to-feature transformation to extract the initial features of each sensor. Subsequently, the Dynamic Information Interaction module explicitly models the interaction intensity between any two sensors and performs dynamic information aggregation between sensors according to the learned intensities. Extensive experiments on four diverse WHAR datasets and two different resource-constrained devices validate that DynamicWHAR outperforms state-of-the-art models in both recognition performance and computational complexity.
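The dynamic information interaction described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the shapes, the bilinear pairwise scoring, and the row-softmax normalization are all assumptions standing in for the learned interaction-intensity mechanism; in DynamicWHAR these quantities are learned end-to-end within a graph convolution network.

```python
import numpy as np

# Hypothetical setup: N sensors, each with a d-dim initial feature vector
# (the output of an "Initial Feature Extraction"-style stage).
N, d = 5, 8
rng = np.random.default_rng(0)
features = rng.standard_normal((N, d))   # per-sensor initial features
W = rng.standard_normal((d, d)) * 0.1    # illustrative learnable weights

# Score every sensor pair from the current input features, so the
# interaction graph changes dynamically with each sample.
scores = features @ W @ features.T       # (N, N) pairwise scores

# Normalize each row into interaction intensities (softmax).
exp_s = np.exp(scores - scores.max(axis=1, keepdims=True))
intensity = exp_s / exp_s.sum(axis=1, keepdims=True)

# Aggregate each sensor's feature as an intensity-weighted sum over
# all sensors, yielding correlation-aware refined features.
aggregated = intensity @ features        # (N, d)
```

Because the intensities are computed from the features themselves rather than from a fixed adjacency matrix, two sensors interact strongly only when the current activity makes their signals correlated, which is the core idea the abstract attributes to the Dynamic Information Interaction module.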
Pages: 25