A Lightweight GAN Network for Large Scale Fingerprint Generation

Cited by: 20
Authors
Fahim, Masud An-Nur Islam [1]
Jung, Ho Yub [1]
Affiliation
[1] Chosun Univ, Dept Comp Engn, Gwangju 61452, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Gallium nitride; Training; Generative adversarial networks; Generators; Stability analysis; Image resolution; Measurement; Skip connection; spectral normalization; loss doping; mode collapse; diversity; ADVERSARIAL NETWORK;
DOI
10.1109/ACCESS.2020.2994371
CLC classification
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Generating fingerprint images for biometric purposes is both necessary and challenging. In this study, we present a fingerprint generation approach based on a generative adversarial network (GAN). To ensure stable GAN training, we introduce conditional loss doping, which maintains a continuous flow of gradients. Our study uses a careful combination of a residual network and spectral normalization to generate fingerprints. The proposed average residual connection is more resistant to vanishing gradients than a simple residual connection. Spectral normalization reduces the variance of the generated weights, which further stabilizes training. The proposed scheme applies spectral bounding only to the input and fully connected layers. Our network synthesizes fingerprints up to 256 x 256 pixels in size. We use the multi-scale structural similarity (MS-SSIM) metric to measure the diversity of the generated samples; our model achieves an MS-SSIM score of 0.23 on the generated fingerprints, indicating that the proposed scheme is more likely to produce diverse images and less likely to suffer mode collapse.
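The paper itself does not include code; as a rough NumPy illustration (not the authors' implementation), the two stabilizing ingredients named in the abstract can be sketched as follows. The function names `spectral_normalize` and `average_residual` are hypothetical, and the power-iteration approach to estimating the spectral norm mirrors common practice for spectral normalization; conditional loss doping is not shown.

```python
import numpy as np

def spectral_normalize(W, n_iter=50):
    # Estimate the largest singular value of W by power iteration,
    # then rescale W so its spectral norm is approximately 1,
    # bounding the layer's Lipschitz constant.
    u = np.random.default_rng(0).standard_normal(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # dominant singular value estimate
    return W / sigma

def average_residual(x, fx):
    # Average residual connection: the mean of the input and the
    # branch output, rather than the plain sum x + f(x).
    return 0.5 * (x + fx)

# Hypothetical fully connected layer with spectral bounding,
# followed by an average residual connection.
rng = np.random.default_rng(1)
W = spectral_normalize(rng.standard_normal((64, 64)))
x = rng.standard_normal(64)
y = average_residual(x, np.tanh(W @ x))
```

Because the branch output is averaged with the input rather than added, the activations cannot grow with depth, which is one plausible reading of the claimed extra immunity to vanishing (and exploding) gradients.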
Pages: 92918-92928
Page count: 11