Occlusion-Aware Human Mesh Model-Based Gait Recognition

Cited by: 20
Authors
Xu, Chi [1 ]
Makihara, Yasushi [1 ]
Li, Xiang [1 ]
Yagi, Yasushi [1 ]
Affiliations
[1] Osaka Univ, Inst Sci & Ind Res, Osaka 5670047, Japan
Keywords
Gait recognition; Feature extraction; Image reconstruction; Shape; Three-dimensional displays; Cameras; Videos; Partial occlusion; gait recognition; human mesh model; ROBUST; IDENTIFICATION;
DOI
10.1109/TIFS.2023.3236181
Chinese Library Classification (CLC) Number
TP301 [Theory, Methods];
Subject Classification Number
081202;
Abstract
Partial occlusion of the human body caused by obstacles or a limited camera field of view often occurs in surveillance videos and degrades the performance of gait recognition in practice. Existing methods for gait recognition under occlusion require a bounding box or the height of the full human body as a prerequisite, which is unavailable in occlusion scenarios. In this paper, we propose an occlusion-aware model-based gait recognition method that works directly on gait videos under occlusion without this prerequisite. Specifically, given a gait sequence whose images contain only non-occluded body parts, we directly fit a skinned multi-person linear (SMPL)-based human mesh model to the input images without any pre-normalization or registration of the human body. We then use the pose and shape features extracted from the estimated SMPL model for recognition, and feed the estimated camera parameters into an occlusion attenuation module that reduces the intra-subject variation in human model fitting caused by differences in occlusion patterns. Experiments on occlusion samples simulated from the OU-MVLP dataset demonstrated the effectiveness of the proposed method, which outperformed state-of-the-art gait recognition methods by about 15% in rank-1 identification rate and 2% in equal error rate in the identification and verification scenarios, respectively.
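The pipeline described in the abstract can be illustrated with a minimal, hypothetical Python sketch. All names here (fit_smpl, extract_gait_feature, attenuate_occlusion, match) and the stubbed SMPL regressor and learned projection are assumptions for illustration only, not the authors' implementation:

# Hypothetical sketch: per-frame SMPL fitting on occluded crops, temporal pooling of
# pose/shape features, and an occlusion-attenuation step driven by the estimated
# camera parameters. SMPL fitting is stubbed with random outputs.
import numpy as np

def fit_smpl(frame):
    """Stub: regress SMPL pose (72-D), shape (10-D), and a weak-perspective
    camera (scale, tx, ty) from a possibly occluded person crop."""
    rng = np.random.default_rng(abs(hash(frame.tobytes())) % (2**32))
    return rng.normal(size=72), rng.normal(size=10), rng.normal(size=3)

def extract_gait_feature(frames):
    """Fit SMPL per frame, then pool pose/shape over time into one descriptor."""
    poses, shapes, cams = zip(*(fit_smpl(f) for f in frames))
    pose_feat = np.mean(poses, axis=0)    # temporal average of pose parameters
    shape_feat = np.mean(shapes, axis=0)  # temporal average of shape parameters
    cam_feat = np.mean(cams, axis=0)      # estimated camera parameters
    return np.concatenate([pose_feat, shape_feat]), cam_feat

def attenuate_occlusion(feat, cam_feat, proj):
    """Hypothetical attenuation: subtract the feature component explained by the
    occlusion-dependent camera estimate (proj would be learned in practice)."""
    return feat - proj @ cam_feat

def match(probe_feat, gallery_feats):
    """Rank gallery subjects by cosine similarity to the probe feature."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
    scores = [cos(probe_feat, g) for g in gallery_feats]
    return int(np.argmax(scores)), scores

# Usage with stand-in data: 30 frames of 128x88 crops and a 5-subject gallery.
frames = [np.random.rand(128, 88) for _ in range(30)]
feat, cam = extract_gait_feature(frames)
proj = np.zeros((feat.size, cam.size))    # placeholder for a learned projection
feat = attenuate_occlusion(feat, cam, proj)
gallery = [np.random.rand(feat.size) for _ in range(5)]
best_id, _ = match(feat, gallery)

In the actual method, the per-frame fitting and the attenuation module would be learned networks; this sketch only mirrors the data flow stated in the abstract.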
Pages: 1309 - 1321
Page count: 13
Related Papers
50 records in total
  • [21] Differentiable-Optimization Based Neural Policy for Occlusion-Aware Target Tracking
    Masnavi, Houman
    Singh, Arun Kumar
    Janabi-Sharifi, Farrokh
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (12): 11714 - 11721
  • [22] Relighting Humans: Occlusion-Aware Inverse Rendering for Full-Body Human Images
    Kanamori, Yoshihiro
    Endo, Yuki
    SIGGRAPH ASIA'18: SIGGRAPH ASIA 2018 TECHNICAL PAPERS, 2018,
  • [23] Relighting Humans: Occlusion-Aware Inverse Rendering for Full-Body Human Images
    Kanamori, Yoshihiro
    Endo, Yuki
    ACM TRANSACTIONS ON GRAPHICS, 2018, 37 (06):
  • [24] Human Gait Recognition Based on Compactness
    Chen, Feng
    Jiang, Jie
    Zhang, Guangjun
    SEVENTH INTERNATIONAL SYMPOSIUM ON INSTRUMENTATION AND CONTROL TECHNOLOGY: SENSORS AND INSTRUMENTS, COMPUTER SIMULATION, AND ARTIFICIAL INTELLIGENCE, 2008, 7127
  • [25] Human Recognition Based on Gait Poses
    Martin-Felez, Raul
    Mollineda, Ramon A.
    Salvador Sanchez, J.
    PATTERN RECOGNITION AND IMAGE ANALYSIS: 5TH IBERIAN CONFERENCE, IBPRIA 2011, 2011, 6669 : 347 - 354
  • [26] Occlusion-aware particle size distribution detection of gravel material based on the improved Bilayer Convolutional Network
    Hu, Yike
    Wang, Jiajun
    Wang, Xiaoling
    Guan, Tao
    CONSTRUCTION AND BUILDING MATERIALS, 2023, 404
  • [27] PoseMapGait: A model-based gait recognition method with pose estimation maps and graph convolutional networks
    Liao, Rijun
    Li, Zhu
    Bhattacharyya, Shuvra S.
    York, George
    NEUROCOMPUTING, 2022, 501 : 514 - 528
  • [28] Human gait recognition based on Caffe deep learning framework
    Wang, Jiwu
    Chen, Feng
    ICAROB 2018: PROCEEDINGS OF THE 2018 INTERNATIONAL CONFERENCE ON ARTIFICIAL LIFE AND ROBOTICS, 2018, : 109 - 111
  • [29] Human gait recognition based on principal curve component analysis
    Su, Han
    Chen, Wei
    Hong, Wen
    WCICA 2006: SIXTH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-12, CONFERENCE PROCEEDINGS, 2006, : 10270 - 10274
  • [30] Haralick Features for GEI-based Human Gait Recognition
    Lishani, Ait O.
    Boubchir, Larbi
    Bouridane, Ahmed
    2014 26TH INTERNATIONAL CONFERENCE ON MICROELECTRONICS (ICM), 2014, : 36 - 39