Action Recognition Based on Multi-feature Depth Motion Maps

Cited: 0
|
Authors
Wang, Dongli [1 ]
Ou, Fang [1 ]
Zhou, Yan [1 ]
Affiliations
[1] Xiangtan Univ, Coll Informat Engn, Xiangtan, Peoples R China
Source
IECON 2018 - 44TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY | 2018
Funding
National Natural Science Foundation of China;
Keywords
action recognition; depth motion map; features fusion; information entropy improved PCA; reconstruction error collaborative classifier;
DOI
None available
CLC Classification Code
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
Depth motion maps (DMMs), which contain abundant appearance and motion information, are obtained by accumulating the absolute difference between consecutive frames of a depth video sequence. In this paper, each depth frame is first projected onto three orthogonal planes (front, side, and top), and the corresponding DMMf, DMMs, and DMMt are generated under the three projection views. To describe the DMMs both locally and globally, histograms of oriented gradients (HOG), local binary patterns (LBP), and a local Gist feature descriptor based on a dense grid are computed. Exploiting the advantages of feature fusion and the quantitative evaluation of information entropy in Principal Component Analysis (PCA), the three descriptors are weighted and fused via an information-entropy-improved PCA to represent the depth video. For action recognition, a collaborative classifier based on an adaptively weighted combination of l(1)-norm and l(2)-norm reconstruction errors is employed, where the adaptive weights are determined by the entropy method. Experimental results on the MSR Action3D dataset show that the proposed approach has strong robustness, discriminability, and stability.
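The DMM construction described in the abstract (project each depth frame onto three orthogonal planes, then accumulate absolute inter-frame differences per view) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `depth_motion_maps` is hypothetical, and the side/top projections are approximated here by a max-projection of depth along the width and height axes, which is one common simplification.

```python
import numpy as np

def depth_motion_maps(depth_video):
    """Sketch of DMM computation for a depth video of shape (T, H, W).

    Returns (DMMf, DMMs, DMMt): for each of the front, side, and top
    projection views, the sum of absolute differences between
    consecutive projected frames.
    """
    depth_video = depth_video.astype(np.float64)

    # Front view: the depth frame itself (H x W per frame).
    front = depth_video
    # Side / top views (assumption): max-project depth along the
    # width / height axes; binary occupancy maps over the (H, depth)
    # and (W, depth) planes are a common alternative.
    side = depth_video.max(axis=2)   # shape (T, H)
    top = depth_video.max(axis=1)    # shape (T, W)

    def accumulate(proj):
        # Accumulate |frame[t+1] - frame[t]| over the whole sequence.
        return np.abs(np.diff(proj, axis=0)).sum(axis=0)

    return accumulate(front), accumulate(side), accumulate(top)
```

Each returned map is a single 2D (or 1D, under this simplified projection) image summarizing where motion occurred in that view; the abstract's HOG, LBP, and Gist descriptors would then be extracted from these maps.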
Pages: 2683 - 2688
Page count: 6
Related Papers
50 records total
  • [31] Motion magnification multi-feature relation network for facial microexpression recognition
    Zhang, Jing
    Yan, Boyun
    Du, Xiaohui
    Guo, Quanhao
    Hao, Ruqian
    Liu, Juanxiu
    Liu, Lin
    Ni, Guangming
    Weng, Xiechuan
    Liu, Yong
    COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (04) : 3363 - 3376
  • [33] Multi-Feature based Hand-Gesture Recognition
    Herath, H. M. S. P. B.
    Ekanayake, M. P. B.
    Godaliyadda, G. M. R. I.
    Wijayakulasooriya, J. V.
    2015 FIFTEENTH INTERNATIONAL CONFERENCE ON ADVANCES IN ICT FOR EMERGING REGIONS (ICTER), 2015, : 63 - 68
  • [34] Multi-Feature Based Emotion Recognition for Video Clips
    Liu, Chuanhe
    Tang, Tianhao
    Lv, Kui
    Wang, Minghao
    ICMI'18: PROCEEDINGS OF THE 20TH ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2018, : 630 - 634
  • [35] Human action recognition based on multi-feature fusion and hierarchical BP-AdaBoost algorithm
    Wu, Z. (zhenyang@seu.edu.cn), 1600, Southeast University (44):
  • [36] Action Recognition from Depth Sequences Using Depth Motion Maps-based Local Binary Patterns
    Chen, Chen
    Jafari, Roozbeh
    Kehtarnavaz, Nasser
    2015 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2015, : 1092 - 1099
  • [37] Robust human action recognition based on depth motion maps and improved convolutional neural network
    Cai, Linqin
    Liu, Xiaolin
    Chen, Fuli
    Xiang, Min
    JOURNAL OF ELECTRONIC IMAGING, 2018, 27 (05)
  • [38] Object tracking based on multi-feature fusion and motion prediction
    Zhou, Zhiyu
    Luo, Kaikai
    Wang, Yaming
    Zhang, Jianxin
    Journal of Computational Information Systems, 2011, 7 (16): : 5940 - 5947
  • [39] Traffic lights detection and recognition based on multi-feature fusion
    Wenhao Wang
    Shanlin Sun
    Mingxin Jiang
    Yunyang Yan
    Xiaobing Chen
    Multimedia Tools and Applications, 2017, 76 : 14829 - 14846
  • [40] Multi-feature gait recognition with DNN based on sEMG signals
    Yao, Ting
    Gao, Farong
    Zhang, Qizhong
    Ma, Yuliang
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2021, 18 (04) : 3521 - 3542