Journey into gait biometrics: Integrating deep learning for enhanced pattern recognition

Cited by: 1
Authors
Parashar, Anubha [1 ]
Parashar, Apoorva [2 ]
Rida, Imad [3 ]
Affiliations
[1] Manipal Univ Jaipur, Sch Comp & Informat Technol, Jaipur, Rajasthan, India
[2] Mahindra Integrated Business Solut, Emerging Technol, Mumbai, India
[3] Univ Technol Compiegne, BMBI Lab, F-60200 Compiegne, France
Keywords
Gait recognition; Biometrics; Deep learning; Surveillance; Pattern recognition; Motion; Selection; Video
DOI
10.1016/j.dsp.2024.104393
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Exploring gait biometrics within the domain of deep learning offers a potent fusion that significantly enhances pattern recognition capabilities. Over the past decade, the evolution of deep learning (DL) pipelines has demonstrated their effectiveness in overcoming complex challenges in image and signal processing applications. Constructing these pipelines requires a deep understanding of the diverse intermediate layers and their implications, and the iterative refinement process involves careful selection and rigorous performance validation of each configuration, demanding considerable time and deliberation. Consequently, selecting a robust DL pipeline that excels across various datasets remains challenging. The central objective of this review is to guide researchers toward a comprehensive grasp of the distinct gait sensing technologies while establishing a solid foundation in deep learning concepts. Although gait recognition is a relatively recent development and has yet to find widespread application in real-world scenarios, this article offers a thorough examination of gait biometrics tailored specifically to real-time surveillance applications. It elucidates the crucial parameters governing deep learning pipelines and their nuanced selection to address specific challenges. Through an analysis of recent research articles on deep learning models and their performance across diverse datasets, the review outlines the merits and demerits of various approaches. The ultimate aim is to facilitate the development of an optimized pipeline that seamlessly integrates existing methodologies, enabling swift yet precise results for a given problem.
Pages: 12
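
To ground the abstract's discussion of deep learning pipelines for gait recognition, below is a minimal illustrative sketch, not taken from the article: a small PyTorch CNN that maps a gait energy image (GEI), the per-cycle average of aligned silhouettes, to an identity embedding and then to subject logits. The 64x44 input resolution, the layer widths, and the toy training step are assumptions chosen only for this example.

import torch
import torch.nn as nn

class GaitCNN(nn.Module):
    # Hypothetical baseline: GEI -> conv features -> embedding -> subject logits.
    def __init__(self, num_subjects: int, embedding_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                 # 64x44 -> 32x22
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                 # 32x22 -> 16x11
        )
        self.embed = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 11, embedding_dim), nn.ReLU(),
        )
        self.classifier = nn.Linear(embedding_dim, num_subjects)

    def forward(self, gei: torch.Tensor) -> torch.Tensor:
        # gei: (batch, 1, 64, 44) silhouette averages scaled to [0, 1]
        return self.classifier(self.embed(self.features(gei)))

# One toy training step on random tensors, only to show the loop structure;
# 124 subjects mirrors the CASIA-B dataset, but the data here is synthetic.
model = GaitCNN(num_subjects=124)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

geis = torch.rand(8, 1, 64, 44)          # fake batch of GEIs
labels = torch.randint(0, 124, (8,))     # fake subject identities
loss = criterion(model(geis), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"toy training loss: {loss.item():.4f}")

In practice, the review's point is that many such configurations (input representation, layer depth, validation protocol) must be compared across datasets before a pipeline generalizes; this sketch is only the skeleton over which such comparisons iterate.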