Instance Normalization: The Missing Ingredient for Fast Stylization
Paper summary

* Style transfer between images works - in its original form - by iteratively modifying a content image until its style matches more and more the style of a chosen style image.
* That iterative process is very slow.
* Alternatively, one can train a single feed-forward generator network to apply a style in one forward pass. The network is trained on a dataset of input images and their stylized versions (the stylized versions can be generated with the iterative approach).
* So far, these generator networks were much faster than the iterative approach, but their quality was lower.
* The authors describe a simple change to these generator networks that raises the image quality up to the level of the iterative approach.

### How

* In the generator networks, they simply replace all batch normalization layers with instance normalization layers.
* Batch normalization normalizes using statistics computed over the whole batch, while instance normalization normalizes each feature map of each instance on its own (a minimal sketch contrasting the two is given after the results below).
* Equations
  * Let `H` = height, `W` = width, `T` = batch size, and let `x_tijk` denote the value at batch index `t`, channel `i` and spatial position `j, k`.
  * Batch Normalization (mean and variance per channel, over the batch and the spatial dimensions):
    * $y_{tijk} = \frac{x_{tijk} - \mu_i}{\sqrt{\sigma_i^2 + \epsilon}}, \quad \mu_i = \frac{1}{HWT} \sum_{t=1}^{T} \sum_{l=1}^{W} \sum_{m=1}^{H} x_{tilm}, \quad \sigma_i^2 = \frac{1}{HWT} \sum_{t=1}^{T} \sum_{l=1}^{W} \sum_{m=1}^{H} (x_{tilm} - \mu_i)^2$
  * Instance Normalization (mean and variance per instance and channel, over the spatial dimensions only):
    * $y_{tijk} = \frac{x_{tijk} - \mu_{ti}}{\sqrt{\sigma_{ti}^2 + \epsilon}}, \quad \mu_{ti} = \frac{1}{HW} \sum_{l=1}^{W} \sum_{m=1}^{H} x_{tilm}, \quad \sigma_{ti}^2 = \frac{1}{HW} \sum_{l=1}^{W} \sum_{m=1}^{H} (x_{tilm} - \mu_{ti})^2$
* They apply instance normalization at test time too (identically, i.e. without switching to accumulated batch statistics).

### Results

* Same image quality as the iterative approach, at a fraction of the runtime.
* The paper shows one content image stylized with two different styles using their approach (example figure not reproduced here).
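The following is a minimal NumPy sketch of the two normalizations described above, not code from the paper: the `(T, C, H, W)` tensor layout and the helper names `batch_norm` / `instance_norm` are assumptions for illustration, and the learnable scale/shift parameters of both layers are omitted.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Batch normalization: one mean/variance per channel,
    computed over the batch and the spatial dimensions.
    x has shape (T, C, H, W) = (batch, channels, height, width)."""
    mu = x.mean(axis=(0, 2, 3), keepdims=True)    # shape (1, C, 1, 1)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def instance_norm(x, eps=1e-5):
    """Instance normalization: one mean/variance per (instance, channel),
    computed over the spatial dimensions only.
    x has shape (T, C, H, W)."""
    mu = x.mean(axis=(2, 3), keepdims=True)       # shape (T, C, 1, 1)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

# Quick check: instance norm gives every single feature map zero mean and
# unit variance, independently of the other images in the batch.
x = np.random.randn(4, 3, 32, 32) * 5.0 + 2.0     # toy batch
y = instance_norm(x)
print(y[0, 0].mean(), y[0, 0].std())              # ~0.0, ~1.0 for each (t, i)
```

In a framework such as PyTorch, the change described in the paper corresponds to swapping `nn.BatchNorm2d` layers for `nn.InstanceNorm2d` layers in the generator network.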
Instance Normalization: The Missing Ingredient for Fast Stylization
Dmitry Ulyanov and Andrea Vedaldi and Victor Lempitsky
arXiv e-Print archive - 2016
Keywords: cs.CV


Summary by Alexander Jung 2 years ago
