Paper Title
Context-Based Video Stylization
Abstract
This paper describes an approach to context-based stylization that analyzes the environment depicted in a content image, together with features present in the image, to predict styles that yield better stylized output. The paper first discusses the limitations of current techniques and then defines our methodology for generating styles from the input content image. It then describes how the obtained style is applied to a video so that only the portions of the video whose environment is similar to that of the given content frame are stylized. The paper defines a context-based partitioning algorithm used to separate a video into contextually similar parts. Finally, it discusses the performance of the proposed methodology and the results obtained.
Keywords - Video Stylization; Contextual Video Partitioning; Arbitrary Neural Style Transfer