Edit Propagation Using Deep Neural Network from a Single Image

Cited by: 0
Authors
Gui Y. [1 ,2 ]
Guo L. [1 ,2 ]
Zeng G. [1 ,2 ]
Affiliations
[1] School of Computer & Communication Engineering, Changsha University of Science & Technology, Changsha
[2] Hunan Provincial Key Laboratory of Intelligent Processing of Big Data on Transportation, Changsha University of Science & Technology, Changsha
Source
Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics | 2019 / Vol. 31 / No. 8
Keywords
Deep neural network; Edit propagation; Fully connected random field model; Image appearance editing
DOI
10.3724/SP.J.1089.2019.17558
Abstract
This paper proposes a novel edit propagation approach using a deep neural network (DNN) on a single image, which aims to avoid the problems of choosing appropriate features and manual feature tuning. Firstly, we transform user interactions into distance maps, which are concatenated with the input image to create a new multi-channel image that combines low-level visual features with spatial features. Secondly, we extract small multi-channel patches and use them as input to a DNN, which learns deep features adapted to the user interactions; the DNN performs joint end-to-end learning of visual and spatial features for edit propagation and automatically determines the importance of each image feature. Finally, we use the DNN as a classifier to estimate a probability for every image pixel, and obtain high-quality editing results through further post-processing. Experimental results on the MARA 1k database demonstrate that our method responds well to user interactions and propagates image edits significantly better. © 2019, Beijing China Science Journal Publishing Co. Ltd. All rights reserved.
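The pipeline summarized in the abstract (user strokes turned into distance maps, concatenated with the image, then small multi-channel patches fed to a DNN classifier) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the function names, the per-class stroke dictionary, the diagonal normalization, and the 5x5 patch size are all assumptions.

```python
import numpy as np

def distance_maps(shape, strokes):
    """Build one spatial distance map per user-stroke class.

    strokes: dict mapping class label -> list of (row, col) scribble pixels.
    Each map stores, for every pixel, the Euclidean distance to the nearest
    scribble pixel of that class, normalized by the image diagonal.
    """
    h, w = shape
    rows, cols = np.mgrid[0:h, 0:w]
    maps = []
    for label in sorted(strokes):
        pts = np.asarray(strokes[label], dtype=float)          # (n, 2)
        d = np.sqrt((rows[..., None] - pts[:, 0]) ** 2 +
                    (cols[..., None] - pts[:, 1]) ** 2).min(axis=-1)
        maps.append(d / np.hypot(h - 1, w - 1))
    return np.stack(maps, axis=-1)                             # (h, w, classes)

def make_multichannel(image, strokes):
    """Concatenate the RGB image with its stroke distance maps,
    coupling low-level visual features with spatial features."""
    return np.concatenate(
        [image, distance_maps(image.shape[:2], strokes)], axis=-1)

def extract_patches(volume, size=5):
    """Slide a size x size window over the multi-channel volume and return
    one flattened patch per interior pixel (one DNN input vector each)."""
    h, w, c = volume.shape
    r = size // 2
    patches = [volume[i - r:i + r + 1, j - r:j + r + 1].ravel()
               for i in range(r, h - r) for j in range(r, w - r)]
    return np.asarray(patches)
```

A small classifier (e.g. a few fully connected layers ending in a softmax over the stroke classes) would then map each flattened patch vector to per-pixel class probabilities, which the post-processing step refines.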
Pages: 1391-1402
Page count: 11
References
25 records in total
  • [1] Levin A., Lischinski D., Weiss Y., Colorization using optimization, ACM Transactions on Graphics, 23, 3, pp. 689-694, (2004)
  • [2] Yatziv L., Sapiro G., Fast image and video colorization using chrominance blending, IEEE Transactions on Image Processing, 15, 5, pp. 1120-1129, (2006)
  • [3] Qu Y.G., Wong T.T., Heng P.A., Manga colorization, ACM Transactions on Graphics, 25, 3, pp. 1214-1220, (2006)
  • [4] Luan Q., Wen F., Cohen-Or D., et al., Natural image colorization, Proceedings of the 18th Eurographics Conference on Rendering Techniques, pp. 309-320, (2007)
  • [5] Lischinski D., Farbman Z., Uyttendaele M., et al., Interactive local adjustment of tonal values, ACM Transactions on Graphics, 25, 3, pp. 646-653, (2006)
  • [6] An X.B., Pellacini F., AppProp: all-pairs appearance-space edit propagation, ACM Transactions on Graphics, 27, 3, (2008)
  • [7] Xu K., Li Y., Ju T., et al., Efficient affinity-based edit propagation using K-D tree, ACM Transactions on Graphics, 28, 5, (2009)
  • [8] Li Y., Ju T., Hu S.M., Instant propagation of sparse edits on images and videos, Computer Graphics Forum, 29, 7, pp. 2049-2054, (2010)
  • [9] Chen X.W., Zou D.Q., Zhao Q.P., et al., Manifold preserving edit propagation, ACM Transactions on Graphics, 31, 6, (2012)
  • [10] Xu L., Yan Q., Jia J.Y., A sparse control model for image and video editing, ACM Transactions on Graphics, 32, 6, (2013)