Artistic style transfer model based on Gatys et al paper

SamLynnEvans/style_transfer


style_transfer

Neural style transfer imposes the style of one image upon the content of another. The technique was first proposed by Gatys et al. and has since been used to create images in the manner of particularly distinctive artists.

In this method, a content image and a style image are passed through the layers of a pretrained image-classification model (in this case VGG-16). As the layers get deeper, the model pays attention to more abstract and stylistic features of the image. A loss function compares these abstract features obtained from the style image to those of the content image, and the content image is changed to make it more similar.
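The loss described above has two parts: a content term that compares raw activations at a deep layer, and a style term that compares Gram matrices of the feature maps (channel-by-channel correlations that capture texture while discarding spatial layout). This is a minimal NumPy sketch of those two terms, not the notebook's actual code; the function names and the alpha/beta weights are illustrative assumptions.

```python
import numpy as np

def gram_matrix(features):
    # features: (C, H, W) activation map from one VGG layer.
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    # Channel correlations capture "style" while discarding spatial layout.
    return flat @ flat.T / (c * h * w)

def content_loss(gen, content):
    # Mean squared difference of raw activations preserves image structure.
    return np.mean((gen - content) ** 2)

def style_loss(gen, style):
    # Comparing Gram matrices matches texture statistics, not layout.
    return np.mean((gram_matrix(gen) - gram_matrix(style)) ** 2)

def total_loss(gen, content, style, alpha=1.0, beta=1e3):
    # alpha and beta set how heavily content and style are penalised;
    # the values here are placeholders, not the notebook's defaults.
    return alpha * content_loss(gen, content) + beta * style_loss(gen, style)
```

In the full method this combined loss is minimised by gradient descent on the pixels of the generated image, with the feature maps coming from several VGG-16 layers rather than one.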

(Figure: the content image alongside the style image.)
After reading the paper, I implemented the model myself. Here's an example outcome combining the Great Wave off Kanagawa with my GitHub profile picture.

(Figure: the resulting stylised image.)

To use it yourself, run the Jupyter notebook, setting PathyStyle to the path of your style image and PathContent to the path of your content image. You can also change how heavily the style and content losses are weighted when you call train.

Let me know if you use it, or make any excellent images!
