Deep Neural Representation Guided Face Sketch Synthesis

Bin Sheng, Ping Li, Chenhao Gao, Kwan-Liu Ma

Research output: Contribution to journal › Article

5 Scopus citations

Abstract

Face sketch synthesis has broad applications in fields such as online entertainment and suspect identification. Existing face sketch synthesis methods learn the patch-wise sketch style from a training dataset of photo-sketch pairs. These methods operate directly in RGB color space, which unavoidably introduces noise at patch boundaries. If denoising methods are applied, the sketch edges become blurred and facial structures cannot be restored. Recent research on feature maps, which are the outputs of a given neural network layer, has achieved great success in texture synthesis and artistic image generation. In this paper, we reformulate the face sketch synthesis problem as an optimization task over neural network feature maps. Our results accurately capture the sketch drawing style and make full use of the stylistic information hidden in the training dataset. Unlike previous feature-map-based methods, we utilize Enhanced 3D PatchMatch and cross-layer cost aggregation to obtain the target feature maps for the final results. Extensive experiments show that our approach imitates hand-drawn sketch style vividly and produces high-quality visual results on the CUHK, AR, XM2VTS, and CUFSF face sketch datasets.
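As a minimal illustration of the core idea of matching patches in feature space rather than RGB space, the sketch below performs a brute-force nearest-neighbor patch match between CNN feature maps (here just NumPy arrays standing in for the outputs of a network layer). It is a simplification under stated assumptions: it does not implement the paper's Enhanced 3D PatchMatch or cross-layer cost aggregation, and the function name and array shapes are hypothetical.

```python
import numpy as np

def match_feature_patches(test_feat, photo_feat, sketch_feat, patch=3):
    """Illustrative feature-space patch matching (not the paper's method).

    test_feat:   (C, H, W) feature map of the test photo
    photo_feat:  (C, H, W) feature map of a training photo
    sketch_feat: (C, H, W) feature map of the paired training sketch
    Returns a (C, H, W) estimate of the target sketch feature map,
    assembled from the best-matching sketch patches and averaged
    over overlapping regions.
    """
    C, H, W = test_feat.shape
    out = np.zeros_like(sketch_feat)
    count = np.zeros((H, W))
    # enumerate all candidate photo-feature patches and their positions
    positions = [(y, x) for y in range(H - patch + 1)
                        for x in range(W - patch + 1)]
    cands = np.stack([photo_feat[:, y:y + patch, x:x + patch].ravel()
                      for (y, x) in positions])  # (N, C*patch*patch)
    for y in range(H - patch + 1):
        for x in range(W - patch + 1):
            q = test_feat[:, y:y + patch, x:x + patch].ravel()
            # nearest candidate patch under squared Euclidean distance
            best = np.argmin(((cands - q) ** 2).sum(axis=1))
            by, bx = positions[best]
            # copy the corresponding sketch-feature patch into place
            out[:, y:y + patch, x:x + patch] += \
                sketch_feat[:, by:by + patch, bx:bx + patch]
            count[y:y + patch, x:x + patch] += 1
    return out / np.maximum(count, 1)  # average overlapping patches
```

In the paper's pipeline the matched target feature maps then guide an optimization that produces the final sketch image; the brute-force search above is what PatchMatch-style algorithms accelerate.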

Original language: English (US)
Journal: IEEE Transactions on Visualization and Computer Graphics
DOIs
State: Accepted/In press - Aug 17 2018

Keywords

  • convolutional neural network (CNN)
  • face sketch synthesis
  • Non-photorealistic rendering
  • style transformation

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design

