Real-Time Detection of Fall From Bed Using a Single Depth Camera

Cited by: 42
Authors
Zhao, Feng [1 ]
Cao, Zhiguo [1 ]
Xiao, Yang [1 ]
Mao, Jing [2 ]
Yuan, Junsong [3 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Automat, Natl Key Lab Sci & Technol Multispectral Informat, Wuhan 430074, Hubei, Peoples R China
[2] Huazhong Univ Sci & Technol, Tongji Med Coll, Sch Nursing, Wuhan 430030, Hubei, Peoples R China
[3] SUNY Buffalo, Univ Buffalo, Comp Sci & Engn Dept, Buffalo, NY 14260 USA
Funding
International Science and Technology Cooperation Program of China; National Natural Science Foundation of China;
Keywords
Depth camera; depth comparison feature; fall from bed detection; human body detection; large margin nearest neighbor (LMNN); random forest; PEOPLE;
DOI
10.1109/TASE.2018.2861382
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In medical and home healthcare for the elderly and for patients, fall from bed is a critical accident that may lead to serious injuries. To mitigate its consequences, it is essential to detect the event promptly so that rescue time is not lost. Although efforts based on wearable devices and smart healthcare rooms have already been made to address this problem, their performance is still not satisfactory for practical applications. In this paper, a novel fall-from-bed detection method is proposed. In particular, a depth camera is used as the visual sensor because of its insensitivity to illumination variation and its capacity for privacy protection. To characterize human activity well, an effective human upper-body detection approach that extracts the human head and upper-body center is proposed using a random forest. Compared with existing, widely used human body parsing methods (e.g., the Microsoft Kinect SDK or OpenNI SDK), our approach still works reliably when human-bed interaction occurs. Based on the motion information of the upper body, the fall-from-bed detection task is formulated as a two-class classification problem, which is then solved with the large margin nearest neighbor (LMNN) classification approach. The method meets real-time running requirements on an ordinary computer. In the experiments, we construct a fall-from-bed detection data set containing samples from 42 volunteers (26 males and 16 females) for testing. The experimental results demonstrate the effectiveness and efficiency of our approach.

Note to Practitioners: This paper aims to develop a vision-based surveillance system that automatically detects fall from bed in real time under unconstrained conditions, a task that cannot be handled well by wearable devices. Our approach can be used to build a smart healthcare room. Because a depth camera is used, the proposed system is insensitive to illumination variation and capable of protecting privacy. The proposed human body extraction method handles human-bed interaction well, and an effective motion pattern categorization approach ensures good separation between fall from bed and other activities. According to extensive experiments in laboratory and sickroom environments, our fall-from-bed detection method generally achieves acceptable results with high efficiency. Nevertheless, it is still somewhat sensitive to noise in the depth frames, and when serious human-quilt interaction occurs, the performance of the human body extraction procedure is not satisfactory.
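The abstract names two concrete building blocks: per-pixel upper-body detection with a random forest on depth-comparison features, and a two-class fall/non-fall decision on upper-body motion via large margin nearest neighbor classification. The sketch below is a minimal illustration of those two blocks under stated assumptions, not the authors' implementation: the offset sampling, feature dimensions, and motion descriptor are illustrative, and a plain kNN classifier stands in for LMNN metric learning.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

def depth_comparison_features(depth, pixels, offsets):
    """Shotton-style depth-comparison feature:
    f(x; u, v) = d(x + u / d(x)) - d(x + v / d(x)),
    where offsets u, v are normalized by the depth at pixel x."""
    h, w = depth.shape
    feats = np.zeros((len(pixels), len(offsets)), dtype=np.float32)
    for i, (y, x) in enumerate(pixels):
        d = max(float(depth[y, x]), 1e-3)              # guard against invalid (zero) depth
        for j, (u, v) in enumerate(offsets):
            p1 = (int(np.clip(y + u[0] / d, 0, h - 1)), int(np.clip(x + u[1] / d, 0, w - 1)))
            p2 = (int(np.clip(y + v[0] / d, 0, h - 1)), int(np.clip(x + v[1] / d, 0, w - 1)))
            feats[i, j] = depth[p1] - depth[p2]
    return feats

# Per-pixel upper-body / background classification with a random forest
# (32 random offset pairs is an illustrative choice, not the paper's setting).
rng = np.random.default_rng(0)
offsets = [(rng.uniform(-60, 60, 2), rng.uniform(-60, 60, 2)) for _ in range(32)]
body_clf = RandomForestClassifier(n_estimators=50)
# body_clf.fit(depth_comparison_features(train_depth, train_pixels, offsets), pixel_labels)

# Fall-from-bed decision: two-class nearest-neighbor classification on motion features
# (e.g., the vertical displacement and velocity of the detected upper-body center over a
# short window). The paper first learns a large-margin metric (LMNN); plain kNN is used
# here only as a stand-in for that learned metric.
fall_clf = KNeighborsClassifier(n_neighbors=3)
# fall_clf.fit(motion_features, fall_labels)   # labels: 1 = fall from bed, 0 = other activity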
Pages: 1018-1032
Number of pages: 15
Related Papers
50 records in total
  • [21] Real-time Low-energy Fall Detection Algorithm with a Programmable Truncated MAC
    Solaz, Manuel de la Guia
    Bourke, Alan
    Conway, Richard
    Nelson, John
    OLaighin, Gearoid
    [J]. 2010 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2010, : 2423 - 2426
  • [22] A Portable Augmented-Reality Anatomy Learning System Using a Depth Camera in Real Time
    Manrique-Juan, Cristina
    Grostieta-Dominguez, Zaira V. E.
    Rojas-Ruiz, Ricardo
    Alencastre-Miranda, Moises
    Munoz-Gomez, Lourdes
    Silva-Munoz, Cecilia
    [J]. AMERICAN BIOLOGY TEACHER, 2017, 79 (03) : 176 - 183
  • [23] VNect: Real-time 3D Human Pose Estimation with a Single RGB Camera
    Mehta, Dushyant
    Sridhar, Srinath
    Sotnychenko, Oleksandr
    Rhodin, Helge
    Shafiei, Mohammad
    Seidel, Hans-Peter
    Xu, Weipeng
    Casas, Dan
    Theobalt, Christian
    [J]. ACM TRANSACTIONS ON GRAPHICS, 2017, 36 (04):
  • [24] Real-Time Multiview SAR Imaging Using a Portable Microwave Camera With Arbitrary Movement
    Laviada, Jaime
    Ghasr, Mohammad Tayeb
    Lopez-Portugues, Miguel
    Las-Heras, Fernando
    Zoughi, Reza
    [J]. IEEE TRANSACTIONS ON ANTENNAS AND PROPAGATION, 2018, 66 (12) : 7305 - 7314
  • [25] Real-Time Head Pose Estimation and Face Modeling From a Depth Image
    Luo, Changwei
    Zhang, Juyong
    Yu, Jun
    Chen, Chang Wen
    Wang, Shengjin
    [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2019, 21 (10) : 2473 - 2481
  • [26] Real-time Non-rigid Reconstruction using an RGB-D Camera
    Zollhoefer, Michael
    Niessner, Matthias
    Izadi, Shahram
    Rehmann, Christoph
    Zach, Christopher
    Fisher, Matthew
    Wu, Chenglei
    Fitzgibbon, Andrew
    Loop, Charles
    Theobalt, Christian
    Stamminger, Marc
    [J]. ACM TRANSACTIONS ON GRAPHICS, 2014, 33 (04):
  • [27] Real time fall detection in fog computing scenario
    Shrivastava, Rashmi
    Pandey, Manju
    [J]. CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2020, 23 (04): : 2861 - 2870
  • [28] Face Mask Identification Using Spatial and Frequency Features in Depth Image from Time-of-Flight Camera
    Wang, Xiaoyan
    Xu, Tianxu
    An, Dong
    Sun, Lei
    Wang, Qiang
    Pan, Zhongqi
    Yue, Yang
    [J]. SENSORS, 2023, 23 (03)
  • [29] Real-Time Morphological Measurement of Oriental Melon Fruit Through Multi-Depth Camera Three-Dimensional Reconstruction
    Hong, Suk-Ju
    Kim, Jinse
    Lee, Ahyeong
    [J]. FOOD AND BIOPROCESS TECHNOLOGY, 2024, 17 (12) : 5038 - 5052
  • [30] Real-Time Detection of Cryptocurrency Mining Behavior
    Ye, Ke
    Shen, Meng
    Gao, Zhenbo
    Zhu, Liehuang
    [J]. BLOCKCHAIN AND TRUSTWORTHY SYSTEMS, BLOCKSYS 2022, 2022, 1679 : 278 - 291