Cross-view gait recognition based on a restrictive triplet network

Citations: 13
Authors
Tong, Sui-bing [1 ]
Fu, Yu-zhuo [1 ]
Ling, He-fei [2 ]
Affiliations
[1] Shanghai Jiao Tong University, 800 Dong Chuan Rd., Shanghai 200240, People's Republic of China
[2] Huazhong University of Science and Technology, 1037 Luoyu Rd., Wuhan 430074, Hubei, People's Republic of China
Keywords
Cross-view; Gait recognition; View variations; RTN; Performance; Model
DOI
10.1016/j.patrec.2019.04.010
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
To overcome the influence of view variations, this paper proposes a restrictive triplet network (RTN) for cross-view gait recognition. The network comprises five shared convolutional layers and is optimized with a restrictive triplet loss computed over triplet-based sample groups. These gait samples are selected by a dedicated strategy so that RTN converges faster. The optimized model is then used to extract view-invariant features for cross-view gait recognition. In addition, two networks named BDN and TDN, which match adjacent features at different convolutional layers, are proposed for comparison with RTN. Extensive evaluations are conducted on the CASIA-B, OU-ISIR, and USF datasets. Experimental results indicate that RTN performs best. Compared with state-of-the-art methods, RTN achieves the best recognition scores, reaching 94.62% under a single view angle and 91.68% under cross-view angles. These results demonstrate that RTN is robust against view variations and show its great potential for practical applications. (C) 2019 Elsevier B.V. All rights reserved.
Pages: 212-219 (8 pages)