A High-Performance Gait Recognition Method Based on n-Fold Bernoulli Theory

Cited by: 3
Authors
Zhou, Qing [1 ]
Rasol, Jarhinbek [1 ]
Xu, Yuelei [1 ]
Zhang, Zhaoxiang [1 ]
Hu, Lujuan [1 ]
Affiliations
[1] Northwestern Polytech Univ, Unmanned Syst Res Inst, Xian 710072, Peoples R China
Keywords
Feature extraction; Gait recognition; Classification algorithms; Three-dimensional displays; Computational modeling; Support vector machines; Deep learning; Least squares methods; Gait characteristics; Kinect v2; Bernoulli theory; least-squares support vector machine; NEURAL-NETWORK; ACCURACY; IMAGE;
DOI
10.1109/ACCESS.2022.3212366
CLC number
TP [Automation Technology, Computer Technology];
Subject classification
0812
Abstract
Gait recognition identifies people by the characteristics of their walk. It offers the advantages of noncontact measurement, concealment, and resistance to imitation, and has practical value in surveillance, security, and company management. This paper uses Kinect to collect the three-dimensional coordinates of human skeletal joints. Taking the spatial distances between skeletal nodes as features makes the method insensitive to camera placement and angle. We design a fast, high-accuracy classifier based on the one-versus-one (OVO) and one-versus-rest (OVR) multiclass algorithms derived from the support vector machine (SVM); it can identify persons without enrolled data records, and design optimization greatly reduces the number of required classifiers. To improve accuracy, a filter based on n-fold Bernoulli theory is proposed to raise the classification accuracy of the multiclassifier. We collected 20,000 data sets from fifty volunteers. Experimental results show that the proposed design improves classification accuracy to 99.8% and reduces the number of originally required classifiers by 91%-95%.
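The record does not detail the n-fold Bernoulli filter itself. Assuming it aggregates n independent classifications of the same subject by majority vote (a common reading of "n-fold Bernoulli theory"), the accuracy gain follows the binomial tail. A minimal illustrative sketch (the function name `majority_accuracy` is our own, not from the paper):

```python
from math import comb

def majority_accuracy(p: float, n: int) -> float:
    """Probability that a strict majority of n independent Bernoulli(p)
    trials succeed, i.e. that an n-round majority vote is correct when
    each single classification is correct with probability p."""
    need = n // 2 + 1  # strict majority (n assumed odd to avoid ties)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(need, n + 1))

# A base classifier at 90% accuracy, filtered by a 5-fold majority vote,
# already exceeds 99% accuracy:
print(round(majority_accuracy(0.90, 5), 4))  # 0.9914
```

Under this assumption, even a modest base classifier can approach the 99.8% figure reported in the abstract once several independent gait cycles are voted on.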
Pages: 115744-115757
Page count: 14
Related papers
50 records in total
  • [21] High-Performance Real-Time Human Activity Recognition Using Machine Learning
    Thottempudi, Pardhu
    Acharya, Biswaranjan
    Moreira, Fernando
    MATHEMATICS, 2024, 12 (22)
  • [22] Gait Recognition Method Based on Hybrid Kernel and Optimized Parameter SVM
    Ni, Jian
    Liang, Libo
    2009 2ND IEEE INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND INFORMATION TECHNOLOGY, VOL 4, 2009, : 60 - 63
  • [23] A Novel Method of Gait Recognition Based on Kernel Fisher Discriminant Analysis
    Su, Han
    Yang, Mian
    Xu, Hua
    2008 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), VOLS 1-6, 2008, : 830 - +
  • [24] Multimodel-Based Gait Recognition Method with Joint Motion Constraints
    Qi, Yanjun
    Xu, Yi
    He, Xuan
    TRAITEMENT DU SIGNAL, 2024, 41 (01) : 189 - 199
  • [25] Multi-view Gait Recognition Method Based on RBF Network
    Qiu, Yaru
    Song, Yonghong
    BIOMETRIC RECOGNITION, CCBR 2018, 2018, 10996 : 96 - 108
  • [26] Signal modeling for high-performance robust isolated word recognition
    Karnjanadecha, M
    Zahorian, SA
    IEEE TRANSACTIONS ON SPEECH AND AUDIO PROCESSING, 2001, 9 (06): : 647 - 654
  • [27] EEG Signal Based Human Emotion Recognition Brain-computer Interface using Deep Learning and High-Performance Computing
    Singh, Vinay Kumar
    Prakash, Shiv
    Dixit, Pratibha
    Prasad, Mukesh
    WIRELESS PERSONAL COMMUNICATIONS, 2025, 140 (1-2) : 165 - 192
  • [28] YOLOv8n-RSDD: A High-Performance Low-Complexity Rail Surface Defect Detection Network
    Fang, Zhanao
    Li, Liming
    Peng, Lele
    Zheng, Shubin
    Zhong, Qianwen
    Zhu, Ting
    IEEE ACCESS, 2024, 12 : 196249 - 196265
  • [29] Facial Expression Recognition: Utilizing Digital Image Processing, Deep Learning, and High-Performance Computing
    Reveriano, Francisco
    Sakoglu, Unal
    Lu, Jiang
    PEARC '19: PROCEEDINGS OF THE PRACTICE AND EXPERIENCE IN ADVANCED RESEARCH COMPUTING ON RISE OF THE MACHINES (LEARNING), 2019,
  • [30] Gait recognition and intention perception method based on human body model mapping
    Jia X.
    Wang T.
    Liu J.
    Li T.
    Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, 2020, 41 (12): : 236 - 244