Neural Style Transfer

Neural style transfer is based on the observation that convolutional neural networks provide a semantic encoding of an image: the first layers extract very basic features such as corners and edges, while later layers represent combinations of these basic features. Since these higher-level features encode information such as textures and local geometric details, they can be used to formulate a style transfer.
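As a rough illustration of how such intermediate encodings can be accessed, the following sketch builds a feature extractor from a pretrained VGG19 in tf.keras. The particular layer choices shown here are a common convention for style transfer, not a requirement, and the names are the standard Keras layer names.

```python
# Minimal sketch: pull intermediate activations out of a pretrained VGG19.
# Lower blocks (e.g. block1_conv1) respond to edges and corners, while
# deeper blocks (e.g. block5_conv2) respond to larger-scale structure.
import tensorflow as tf

vgg = tf.keras.applications.VGG19(include_top=False, weights="imagenet")
vgg.trainable = False

# Layers commonly used for style (early/mid blocks) and content (a deep block).
style_layer_names = ["block1_conv1", "block2_conv1", "block3_conv1",
                     "block4_conv1", "block5_conv1"]
content_layer_name = "block5_conv2"

outputs = [vgg.get_layer(name).output
           for name in style_layer_names + [content_layer_name]]
feature_extractor = tf.keras.Model(inputs=vgg.input, outputs=outputs)

# image: a float32 tensor of shape (1, height, width, 3), preprocessed for VGG.
# features = feature_extractor(image)
```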

In general, style transfer is based on a joint optimization of two aspects, each represented by its own loss term: content similarity and style similarity.

Some examples

In this section, we transform several content and style image combinations in order to illustrate how the algorithm works and what the results look like.

The Atrium

This image shows the Atrium of the Welfenschloss, nowadays the main building of the University of Hanover. This beautiful photograph was taken by Christian A. Schröder (ChristianSchd) and released to the public.

This image serves as the style image. It was taken from a GitHub repository on style transfer (Github), where many other examples of style images can be found.

This is an early iteration of the optimization on this image pair. You can already see both the blue swirls of the style image and the content of the original image.

This is a later iteration. Looks great, doesn’t it? You can learn how to do it yourself:

Methodology

Given an input image I, a style image S, and a randomly initialized output image O, we use some optimization technique to minimize a loss consisting of two weighted terms: a content loss expressing that the image O should match the content of I, and a style loss expressing that the image O should match the style of S. We repeatedly vary the image O so that the weighted sum of these two losses becomes smaller and smaller.
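Concretely, the content loss is typically a mean squared difference between the activations of I and O at one deep layer, while the style loss compares Gram matrices (channel correlations) of the activations of S and O at several layers. The following sketch in TensorFlow/Keras illustrates these two terms; the function names and the exact formulation are my own for illustration and are not necessarily those used in the examples above.

```python
import tensorflow as tf

def content_loss(content_features, output_features):
    # Mean squared difference between the content image's and the output
    # image's activations at the chosen content layer.
    return tf.reduce_mean(tf.square(output_features - content_features))

def gram_matrix(features):
    # Correlations between feature channels; this is what captures "style".
    # features: tensor of shape (1, height, width, channels)
    result = tf.linalg.einsum("bijc,bijd->bcd", features, features)
    shape = tf.shape(features)
    num_positions = tf.cast(shape[1] * shape[2], tf.float32)
    return result / num_positions

def style_loss(style_features, output_features):
    # Match the Gram matrices of the style image and the output image.
    return tf.reduce_mean(tf.square(gram_matrix(output_features)
                                    - gram_matrix(style_features)))

# Weighted sum of both terms; alpha and beta control the content/style tradeoff:
# total = alpha * content_loss(...) + beta * sum(style_loss(...) over style layers)
```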

It is, however, generally not possible to find an image O that is perfect with respect to both constraints of content and style similarity. Therefore, the relative weighting of the two losses is a tradeoff that needs to be tuned.
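One way to play with this tradeoff is to expose the two weights directly in the optimization loop. The following sketch assumes the feature_extractor and the loss functions from the snippets above, plus precomputed targets content_targets and style_targets (the activations of I and S, respectively); alpha and beta are the weights that control how strongly content and style are enforced.

```python
import tensorflow as tf

# output_image is the variable we optimize; initialized randomly (or from I).
output_image = tf.Variable(tf.random.uniform((1, 224, 224, 3)))
optimizer = tf.keras.optimizers.Adam(learning_rate=0.02)

alpha, beta = 1e4, 1e-2   # content vs. style weight; tune to taste

for step in range(1000):
    with tf.GradientTape() as tape:
        feats = feature_extractor(output_image)
        c = content_loss(content_targets, feats[-1])
        s = tf.add_n([style_loss(t, f)
                      for t, f in zip(style_targets, feats[:-1])])
        loss = alpha * c + beta * s
    grads = tape.gradient(loss, output_image)
    optimizer.apply_gradients([(grads, output_image)])
    # Keep pixel values in a valid range after each update.
    output_image.assign(tf.clip_by_value(output_image, 0.0, 1.0))
```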

Another blog post, Artistic Style Transfer Using Keras and Tensorflow, introduces an approach in Keras that closely follows the original paper by Gatys et al.: Image Style Transfer Using Convolutional Neural Networks.

Material