This project aims to address the flickering problem caused by naively applying per-frame stylization methods (e.g., Fast-Neural-Style and AdaIN) to videos.
1. Background

In 2016, Gatys et al. were the first to propose an image style transfer algorithm based on deep neural networks, capable of transferring artistic style (e.g., colours, textures, and brush strokes) from a given artwork to arbitrary photographs. The visually appealing results and elegant design of their approach have motivated many researchers to explore this field, which later became known as Neural Artistic Style Transfer.
This project aims to implement a torch version of fast photographic style transfer based on Fast-Neural-Style. The teaser image is a stylized result produced by the algorithm described in this project, which takes around 1.40 seconds for an $852 \times 480$ image on a single NVIDIA GTX 1080 Ti card.
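At inference time, fast style transfer in the Fast-Neural-Style tradition is a single forward pass of the content image through a trained feed-forward network, which is why per-image cost stays on the order of a second. Below is a minimal PyTorch-style sketch of such a pass; `TransformerNet`, the module path, and the checkpoint name are hypothetical placeholders, not the repo's actual API:

```python
import time

import torch
from PIL import Image
from torchvision import transforms

# `TransformerNet` and the checkpoint path are placeholders for the trained
# feed-forward stylization network; the actual names in this repo differ.
from transformer_net import TransformerNet

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = TransformerNet().to(device).eval()
model.load_state_dict(torch.load("models/photographic_style.pth", map_location=device))

content = transforms.ToTensor()(Image.open("content.jpg").convert("RGB"))
content = content.unsqueeze(0).to(device)            # 1 x 3 x H x W, values in [0, 1]

with torch.no_grad():
    start = time.time()
    stylized = model(content).clamp(0, 1)             # single forward pass per image
    if device.type == "cuda":
        torch.cuda.synchronize()                      # wait for the GPU before timing
    print(f"Stylized in {time.time() - start:.2f} s")

transforms.ToPILImage()(stylized.squeeze(0).cpu()).save("stylized.jpg")
```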
In this project, I also provide a torch implementation of the Domain Transform (Recursive Filter) described in the following paper:
Domain Transform for Edge-Aware Image and Video Processing. Eduardo S. L. Gastal and Manoel M. Oliveira. ACM Transactions on Graphics (Proceedings of SIGGRAPH 2011).
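As a reference for what this filter computes, here is a minimal NumPy sketch of the recursive-filter (RF) variant of the Domain Transform: the image is smoothed by alternating horizontal and vertical 1D recursive passes whose per-pixel feedback coefficients follow the edge-aware domain transform. The parameter defaults and helper names below are illustrative and do not mirror the repo's implementation:

```python
import numpy as np

def domain_transform_rf(img, sigma_s=60.0, sigma_r=0.4, num_iterations=3):
    """Edge-aware smoothing with the Domain Transform recursive filter.
    `img` is an H x W x C float array with values in [0, 1]."""
    I = img.astype(np.float64)
    H, W, C = I.shape

    # Domain-transform derivatives: 1 + (sigma_s / sigma_r) * sum_k |dI_k|
    dIdx = np.diff(I, axis=1)                      # H x (W-1) x C
    dIdy = np.diff(I, axis=0)                      # (H-1) x W x C
    dHdx = np.ones((H, W))
    dVdy = np.ones((H, W))
    dHdx[:, 1:] += (sigma_s / sigma_r) * np.abs(dIdx).sum(axis=2)
    dVdy[1:, :] += (sigma_s / sigma_r) * np.abs(dIdy).sum(axis=2)

    F = I.copy()
    N = num_iterations
    for i in range(N):
        # Per-iteration kernel width so the overall response matches sigma_s.
        sigma_H = sigma_s * np.sqrt(3.0) * 2.0 ** (N - (i + 1)) / np.sqrt(4.0 ** N - 1.0)
        a = np.exp(-np.sqrt(2.0) / sigma_H)

        F = _recursive_filter_horizontal(F, a ** dHdx)           # filter rows
        F = _recursive_filter_horizontal(F.transpose(1, 0, 2),   # filter columns
                                         (a ** dVdy).T).transpose(1, 0, 2)
    return F

def _recursive_filter_horizontal(F, V):
    """Left-to-right then right-to-left 1D recursive filtering.
    V holds the per-pixel feedback coefficients a^d."""
    F = F.copy()
    H, W, _ = F.shape
    for x in range(1, W):                          # causal pass
        F[:, x] += V[:, x, None] * (F[:, x - 1] - F[:, x])
    for x in range(W - 2, -1, -1):                 # anti-causal pass
        F[:, x] += V[:, x + 1, None] * (F[:, x + 1] - F[:, x])
    return F
```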