ISSN :2582-9793

InkGAN: Generative Adversarial Networks for Ink-and-Wash Style Transfer of Photographs

Original Research (Published on: 30-Jun-2023)
DOI : https://doi.org/10.54364/AAIML.2023.1171

Wenping Wang, Keyi Yu, Yu Wang, Sihan Zeng, Chen Liang, Xiaoyu Bai and Dachi Chen

Adv. Artif. Intell. Mach. Learn., 3 (2):1220-123

1. Wenping Wang: Individual Researcher

2. Keyi Yu: Google Inc, 1600 Amphitheatre Parkway Mountain View, CA, USA 94043

3. Yu Wang: De Anza College, 21250 Stevens Creek Blvd Cupertino, CA, USA 95014

4. Sihan Zeng: Meta Platforms, 1 Hacker Way, Menlo Park, CA, USA 94025

5. Chen Liang: Google Inc, 1600 Amphitheatre Parkway Mountain View, CA, USA 94043

6. Xiaoyu Bai: Meta Platforms, 1 Hacker Way, Menlo Park, CA, USA 94025

7. Dachi Chen: Meta Platforms, 1 Hacker Way, Menlo Park, CA, USA 94025



Article History: Received on: 05-May-23, Accepted on: 01-Jul-23, Published on: 30-Jun-23

Corresponding Author: Wenping Wang

Email: wenpingw@alumni.cmu.edu

Citation: Keyi Yu, et al. InkGAN: Generative Adversarial Networks for Ink-and-Wash Style Transfer of Photographs. Advances in Artificial Intelligence and Machine Learning. 2023;3(2):71.


Abstract


In this work, we present a novel approach to Chinese Ink-and-Wash style transfer using a GAN structure. The proposed method incorporates a smooth loss specially designed for this style transfer task, and an end-to-end framework that seamlessly integrates the components for efficient and effective image style transfer. To demonstrate the advantages of our approach, comparative results against popular style transfer methods such as CycleGAN are presented. The experiments show notable improvements with our proposed method in preserving intricate details and capturing the essence of the Chinese Ink-and-Wash style. Furthermore, an ablation study is conducted to evaluate the contribution of each loss component in our framework. We anticipate that our findings will inspire further advancements in this domain and foster new avenues for artistic expression in the digital realm.
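The abstract does not give the exact form of the smooth loss, but a common way to encourage the flat washes and soft gradients of ink-and-wash imagery is a total-variation penalty on the generated image. The sketch below is purely illustrative of that idea (the function name `smooth_loss` and the formulation are assumptions, not the paper's definition):

```python
import numpy as np

def smooth_loss(img):
    """Total-variation smoothness penalty on an H x W x C image array.

    Sums mean absolute differences between vertically and horizontally
    adjacent pixels; large neighbor-to-neighbor jumps are penalized,
    pushing the generator toward smooth, wash-like regions.
    """
    dh = np.abs(img[1:, :, :] - img[:-1, :, :])  # vertical neighbor diffs
    dw = np.abs(img[:, 1:, :] - img[:, :-1, :])  # horizontal neighbor diffs
    return dh.mean() + dw.mean()

# A perfectly flat image incurs zero penalty; a noisy image does not.
flat = np.full((8, 8, 3), 0.5)
noisy = np.random.default_rng(0).random((8, 8, 3))
print(smooth_loss(flat))   # 0.0
print(smooth_loss(noisy))  # strictly positive
```

In a GAN training loop, such a term would typically be added to the generator objective with a weighting coefficient, alongside the adversarial loss; the ablation study mentioned above isolates the effect of each such component.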
