Extracting human attributes using a convolutional neural network approach

Cited by: 34
|
Authors
Perlin, Hugo Alberto [1 ]
Lopes, Heitor Silverio [2 ]
Affiliations
[1] Parana Fed Inst Parana, Paranagua, PR, Brazil
[2] Univ Tecnol Fed Parana, Curitiba, Parana, Brazil
Keywords
Computer vision; Machine learning; Soft-biometrics; Convolutional Neural Network; Gender recognition; Clothes parsing; CLASSIFICATION; FEATURES; SCALE;
DOI
10.1016/j.patrec.2015.07.012
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Extracting high-level information from digital images and videos is a hard problem frequently faced by the computer vision and machine learning communities. Modern surveillance systems can monitor people, cars, or objects by using computer vision methods. The objective of this work is to propose a method for identifying soft biometrics, in the form of clothing and gender, from images containing people, as a preliminary step toward identifying the people themselves. We propose a solution to this classification problem using a Convolutional Neural Network, working as an all-in-one feature extractor and classifier. This method allows the development of a high-level end-to-end clothing/gender classifier. Experiments were done comparing the CNN with hand-designed classifiers. Also, two different operating modes of the CNN are proposed and compared with each other. The results obtained were very promising, showing that it is possible to extract soft-biometric attributes using an end-to-end CNN classifier. The proposed method achieved good generalization capability, classifying the three different attributes with good accuracy. This suggests the possibility of searching images using soft biometrics as search terms. (C) 2015 Elsevier B.V. All rights reserved.
Pages: 250-259
Page count: 10
Related Papers
50 records in total
  • [1] Extracting Crop Spatial Distribution from Gaofen 2 Imagery Using a Convolutional Neural Network
    Chen, Yan
    Zhang, Chengming
    Wang, Shouyi
    Li, Jianping
    Li, Feng
    Yang, Xiaoxia
    Wang, Yuanyuan
    Yin, Leikun
    APPLIED SCIENCES-BASEL, 2019, 9 (14):
  • [2] Gender Recognition from Facial Images using Convolutional Neural Network
    Mittal, Shubham
    Mittal, Shiva
    2019 FIFTH INTERNATIONAL CONFERENCE ON IMAGE INFORMATION PROCESSING (ICIIP 2019), 2019, : 347 - 352
  • [3] Human action interpretation using convolutional neural network: a survey
    Malik, Zainab
    Bin Shapiai, Mohd Ibrahim
    MACHINE VISION AND APPLICATIONS, 2022, 33 (03)
  • [4] A Novel Convolutional Neural Network with Glial Cells
    Korytkowski, Marcin
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, (ICAISC 2016), PT II, 2016, 9693 : 670 - 679
  • [5] An Approach for Biometric Verification Based on Human Body Communication using Convolutional Neural Network
    Li, Jingzhen
    Liu, Yuhang
    Igbe, Tobore
    Nie, Zedong
    2019 IEEE 9TH INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE-BERLIN), 2019, : 12 - 15
  • [6] A novel approach for detection of dyslexia using convolutional neural network with EOG signals
    Ileri, Ramis
    Latifoglu, Fatma
    Demirci, Esra
    MEDICAL & BIOLOGICAL ENGINEERING & COMPUTING, 2022, 60 (11) : 3041 - 3055
  • [7] Rolling element bearing fault diagnosis using convolutional neural network and vibration image
    Hoang, Duy-Tang
    Kang, Hee-Jun
    COGNITIVE SYSTEMS RESEARCH, 2019, 53 : 42 - 50
  • [8] Extracting Wetland Type Information with a Deep Convolutional Neural Network
    Guan, XianMing
    Wang, Di
    Wan, Luhe
    Zhang, Jiyi
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [9] VIDEO SYSTEM FOR HUMAN ATTRIBUTE ANALYSIS USING COMPACT CONVOLUTIONAL NEURAL NETWORK
    Yang, Yi
    Chen, Feng
    Chen, Xiaoming
    Dai, Yan
    Chen, Zhenyang
    Ji, Jiang
    Zhao, Tong
    2016 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2016, : 584 - 588
  • [10] Bangla Handwritten Digit Recognition Using Convolutional Neural Network
    Rabby, A. K. M. Shahariar Azad
    Abujar, Sheikh
    Haque, Sadeka
    Hossain, Syed Akhter
    EMERGING TECHNOLOGIES IN DATA MINING AND INFORMATION SECURITY, IEMIS 2018, VOL 1, 2019, 755 : 111 - 122