SKIP: Accurate Fall Detection Based on Skeleton Keypoint Association and Critical Feature Perception

Cited by: 2
Authors
Du, Chenjie [1 ]
Jin, Ran [1 ]
Tang, Hao [2 ]
Jiang, Qiuping [3 ]
He, Zhiwei [4 ]
Affiliations
[1] Zhejiang Wanli Univ, Coll Big Data & Software Engn, Ningbo 315100, Peoples R China
[2] Swiss Fed Inst Technol, Dept Informat Technol & Elect Engn, CH-8092 Zurich, Switzerland
[3] Ningbo Univ, Fac Informat Sci & Engn, Ningbo 315211, Peoples R China
[4] Hangzhou Dianzi Univ, Fac Elect Informat, Hangzhou 310018, Peoples R China
Keywords
Critical feature perception (CFP); cross-frame association; fall detection; skeleton keypoints; recognition
DOI
10.1109/JSEN.2024.3379167
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
As deep learning technology advances, human fall detection (HFD) leveraging convolutional neural networks (CNNs) has recently garnered significant interest within the research community. However, most existing works ignore the cross-frame association of skeleton keypoints and the aggregation of feature representations. To address this, we first introduce an image preprocessing (IPP) module, which enhances the foreground and weakens the background. Diverging from common practices that employ an off-the-shelf detector for target position estimation, our skeleton keypoint detection and association (SKDA) module is designed to detect skeleton keypoints and associate those with high affinity across frames. This design reduces the misleading impact of ambiguous detections and ensures the continuity of long-range trajectories. Further, our critical feature perception (CFP) module is crafted to help the model learn more discriminative feature representations for human activity classification. Incorporating the components above, we introduce SKIP, a novel human fall detection approach with improved detection precision. Evaluations on the publicly available telecommunication system team v2 (TSTv2) dataset and a self-built dataset show SKIP's superior performance.
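The abstract does not specify how the SKDA module computes cross-frame associations. As a minimal illustrative sketch only (not the paper's method), association by affinity can be approximated as greedy matching of keypoints between consecutive frames, where spatial proximity stands in for affinity and low-affinity (ambiguous) pairs are left unmatched. The function name and threshold below are hypothetical:

```python
import math

def associate_keypoints(prev, curr, max_dist=50.0):
    """Greedily match each keypoint in `prev` to its nearest keypoint
    in `curr`, discarding pairs farther apart than `max_dist` so that
    ambiguous detections stay unmatched. Keypoints are (x, y) tuples.
    Returns a dict mapping prev index -> curr index."""
    # Enumerate all candidate pairs with their Euclidean distance.
    pairs = []
    for i, (px, py) in enumerate(prev):
        for j, (cx, cy) in enumerate(curr):
            pairs.append((math.hypot(px - cx, py - cy), i, j))
    pairs.sort()  # closest pairs first: distance as a proxy for affinity

    matched, used_prev, used_curr = {}, set(), set()
    for d, i, j in pairs:
        if d > max_dist:
            break  # remaining pairs are even farther apart
        if i in used_prev or j in used_curr:
            continue  # each keypoint joins at most one trajectory
        matched[i] = j
        used_prev.add(i)
        used_curr.add(j)
    return matched
```

Chaining such per-frame matches yields keypoint trajectories over time; a real tracker would typically also use appearance or motion cues rather than distance alone.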
Pages: 14812-14824
Page count: 13