Recreating Traditional Indonesian Batik with Neural Style Transfer in AI Artistry

Michael Joseph; Jeconiah Richard; Calvin S. Halim; Rowin Faadhilah; Nunung N. Qomariyah

Style transfer is a method of combining two images into one, taking the style of one image and the content of the other. Convolutional neural networks have been applied to this method to produce what is known as neural style transfer. Using a VGG network, an artificial system can recreate artistic images by combining different content and styles. However, research on how different models and optimizations affect the quality of the produced image is limited. The aim of this experiment is to compare VGG19 and VGG16. We use data and results from Leon A. Gatys to compare these two models in terms of content loss and style loss. The architecture is also applied to the distinctive style of Batik, to examine the effect of its dominant colors and patterns on another image. With Indonesia's rich culture and its diverse art portfolio, it is only natural that this paper explores neural style transfer's effects on the creative pattern forming of Batik.
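The content loss and style loss mentioned above follow Gatys et al.: content loss is a squared error between VGG feature maps, while style loss compares Gram matrices of those feature maps. A minimal NumPy sketch of these losses is shown below; the normalization constants and the weights `alpha` and `beta` are illustrative assumptions, as implementations vary, and in practice the feature maps would come from a pretrained VGG16 or VGG19.

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) feature map from one VGG layer.
    # The Gram matrix captures correlations between channels, which
    # Gatys et al. use as a representation of style.
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)  # normalization is an assumption; variants differ

def content_loss(gen_feat, content_feat):
    # Squared error between generated and content feature maps.
    return 0.5 * np.sum((gen_feat - content_feat) ** 2)

def style_loss(gen_feat, style_feat):
    # Squared error between Gram matrices of generated and style images.
    G = gram_matrix(gen_feat)
    A = gram_matrix(style_feat)
    return np.sum((G - A) ** 2)

def total_loss(gen_feat, content_feat, style_feat, alpha=1.0, beta=1e3):
    # Weighted sum optimized with respect to the generated image;
    # alpha/beta are hypothetical example weights.
    return alpha * content_loss(gen_feat, content_feat) \
         + beta * style_loss(gen_feat, style_feat)
```

In a full pipeline, the generated image is initialized (e.g., from noise or the content image) and iteratively updated by gradient descent on this total loss.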