GestureGAN for Hand Gesture-to-Gesture Translation in the Wild

Cited: 51
Authors
Tang, Hao [1 ]
Wang, Wei [1 ,2 ]
Xu, Dan [1 ,3 ]
Yan, Yan [4 ]
Sebe, Nicu [1 ]
Affiliations
[1] Univ Trento, Dept Informat Engn & Comp Sci, Trento, Italy
[2] Ecole Polytech Fed Lausanne, Comp Vis Lab, Lausanne, Switzerland
[3] Univ Oxford, Dept Engn Sci, Oxford, England
[4] Texas State Univ, Dept Comp Sci, San Marcos, TX USA
Source
PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18) | 2018
Keywords
Generative Adversarial Networks; Image Translation; Hand Gesture
DOI
10.1145/3240508.3240704
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Hand gesture-to-gesture translation in the wild is a challenging task since hand gestures can have arbitrary poses, sizes, locations and self-occlusions. Therefore, this task requires a high-level understanding of the mapping between the input source gesture and the output target gesture. To tackle this problem, we propose a novel hand Gesture Generative Adversarial Network (GestureGAN). GestureGAN consists of a single generator G and a discriminator D, which take as input a conditional hand image and a target hand skeleton image. GestureGAN utilizes the hand skeleton information explicitly, and learns the gesture-to-gesture mapping through two novel losses, the color loss and the cycle-consistency loss. The proposed color loss handles the issue of "channel pollution" while back-propagating the gradients. In addition, we present the Fréchet ResNet Distance (FRD) to evaluate the quality of generated images. Extensive experiments on two widely used benchmark datasets demonstrate that the proposed GestureGAN achieves state-of-the-art performance on the unconstrained hand gesture-to-gesture translation task. Meanwhile, the generated images are high-quality and photo-realistic, allowing them to be used as data augmentation to improve the performance of a hand gesture classifier. Our model and code are available at https://github.com/Ha0Tang/GestureGAN.
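The proposed Fréchet ResNet Distance follows the same idea as the well-known Fréchet Inception Distance: model real and generated image features as two multivariate Gaussians and compute the Fréchet distance between them, but with ResNet features instead of Inception features. A minimal sketch of that distance computation is below; the random placeholder arrays stand in for ResNet activations (the exact ResNet variant and feature layer are assumptions, not taken from this record).

```python
import numpy as np
from scipy import linalg

def frechet_distance(feats_a, feats_b):
    """Frechet distance between two feature sets, each modeled as a
    multivariate Gaussian (mean + covariance), as in FID/FRD-style metrics."""
    mu1, mu2 = feats_a.mean(axis=0), feats_b.mean(axis=0)
    sigma1 = np.cov(feats_a, rowvar=False)
    sigma2 = np.cov(feats_b, rowvar=False)
    diff = mu1 - mu2
    # Matrix square root of the covariance product; discard the tiny
    # imaginary components that numerical error can introduce.
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

# Placeholder features: in the paper's setting these would be ResNet
# activations of real vs. generated hand-gesture images.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(500, 16))
fake = rng.normal(0.5, 1.0, size=(500, 16))
print(frechet_distance(real, fake))
```

A lower score means the generated feature distribution is closer to the real one; identical feature sets score (numerically) zero.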
Pages: 774-782 (9 pages)