Robust Nonparametric Distribution Transfer with Exposure Correction for Image Neural Style Transfer

Cited by: 0
Authors
Liu, Shuai [1 ]
Hong, Caixia [1 ]
He, Jing [1 ]
Tian, Zhiqiang [1 ]
Affiliation
[1] Xi An Jiao Tong Univ, Sch Software Engn, Xian 710049, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
robust nonparametric distribution transfer; exposure correction; neural style transfer;
DOI
10.3390/s20185232
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Classification Code
070302; 081704;
Abstract
Image neural style transfer uses convolutional neural networks to render a content image in the manner of a style image: the algorithm computes a stylized image that preserves the content of the given content image while adopting the style of the given style image. Style transfer has become a hot topic in both academic literature and industrial applications. The stylized results of existing models are often unsatisfactory because of the color difference between the two input images and inconspicuous details in the content image. To address these problems, we propose two style transfer models based on robust nonparametric distribution transfer. The first model converts the color probability density function of the content image into that of the style image before style transfer. When the color dynamic range of the content image is smaller than that of the style image, this model renders a more reasonable spatial structure than existing models. We then propose an adaptive detail-enhanced exposure correction algorithm for underexposed images. Building on it, the second model targets style transfer with underexposed content images and further improves their stylized results. Compared with popular methods, the proposed methods achieve satisfactory qualitative and quantitative results.
Pages: 1-19
Number of pages: 19
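The pipeline described in the abstract (correct an underexposed content image, align its color distribution with the style image's, then run any neural style transfer model) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: per-channel histogram matching stands in for the robust nonparametric distribution transfer, and a fixed gamma curve stands in for the adaptive detail-enhanced exposure correction; all function names and parameters below are illustrative.

```python
# Minimal sketch (not the authors' implementation) of the two preprocessing
# steps described in the abstract. Assumes 8-bit RGB images as NumPy arrays.
import numpy as np


def match_channel(source, reference):
    """Map one channel's intensity distribution onto the reference channel's
    (classic histogram matching, a simple stand-in for nonparametric
    distribution transfer)."""
    s_values, s_idx, s_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    r_values, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts).astype(np.float64) / source.size
    r_cdf = np.cumsum(r_counts).astype(np.float64) / reference.size
    # For each source quantile, look up the reference intensity at the
    # same cumulative probability.
    matched = np.interp(s_cdf, r_cdf, r_values)
    return matched[s_idx].reshape(source.shape)


def transfer_color_distribution(content, style):
    """Match each color channel of `content` to the distribution of `style`."""
    out = np.empty_like(content, dtype=np.float64)
    for c in range(content.shape[-1]):
        out[..., c] = match_channel(content[..., c], style[..., c])
    return out.astype(content.dtype)


def gamma_exposure_correction(image, gamma=0.6):
    """Brighten an underexposed uint8 image with a fixed gamma curve
    (illustrative only; the paper's correction is adaptive and
    detail-enhanced)."""
    x = image.astype(np.float64) / 255.0
    return (np.power(x, gamma) * 255.0).astype(np.uint8)


# Usage: correct the underexposed content image, align its color
# distribution with the style image, then feed both to any NST model.
# content = gamma_exposure_correction(content)
# content = transfer_color_distribution(content, style)
```

The design point this sketch makes is the one argued in the abstract: distribution alignment and exposure correction are applied to the content image before stylization, so the style transfer network itself is unchanged.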