I would like to sincerely thank the authors of CycleGAN for open-sourcing their PyTorch code. The code here is borrowed in full from the authors' repository; an example dataset has been added. The citation is given once here instead of in every file.
The CycleGAN code was written by Jun-Yan Zhu and Taesung Park (https://ssnl.github.io/).
Kindly refer to the original CycleGAN page for instructions on installing the required packages.
Steps:
Clone the repository and save it to your Google Drive as pytorch-CycleGAN.
Download your preferred datasets by running the following command in Google Colaboratory: !bash ./datasets/download_cyclegan_dataset.sh <dataset name>. Available datasets include horse2zebra, maps (for aerial photo to Google Maps translation), and vangogh2photo.
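For example, to fetch the horse2zebra dataset:

!bash ./datasets/download_cyclegan_dataset.sh horse2zebra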
The images will be downloaded to the datasets folder of the repository in your Google Drive (Google Drive --> pytorch-CycleGAN --> datasets).
You can also use the example dataset provided in the datasets folder.
Create a folder named capstone_trained_model in pytorch-CycleGAN to save your experiment's results (a shell command for this is shown after the directory-change step below).
Command to mount your Google Drive in Colab:
from google.colab import drive
drive.mount('/content/gdrive')
Command to change into the repository directory:
cd gdrive/My Drive/pytorch-CycleGAN
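If you have not already created the capstone_trained_model folder from the Drive web interface, you can create it from a Colab cell with a plain shell command:

!mkdir -p capstone_trained_model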
To train the model:
!python train.py --dataroot ./datasets/<example dataset> --name <experiment name> --direction AtoB --checkpoints_dir capstone_trained_model --init_type kaiming --save_epoch_freq 10
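For example, using the horse2zebra dataset with a hypothetical experiment name horse2zebra_run1:

!python train.py --dataroot ./datasets/horse2zebra --name horse2zebra_run1 --direction AtoB --checkpoints_dir capstone_trained_model --init_type kaiming --save_epoch_freq 10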
To test the model:
!python test.py --dataroot ./datasets/<example dataset> --model cycle_gan --name <experiment name> --direction AtoB --checkpoints_dir capstone_trained_model --num_test 2000
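For example, to test the hypothetical horse2zebra_run1 experiment from above:

!python test.py --dataroot ./datasets/horse2zebra --model cycle_gan --name horse2zebra_run1 --direction AtoB --checkpoints_dir capstone_trained_model --num_test 2000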
Once you have trained the model, you will find the loss log file in the capstone_trained_model --> <experiment name> folder of your Google Drive. You can analyse the loss data and even plot it to check for convergence; a sketch for this is given below. The test results are saved in the pytorch-CycleGAN --> results folder.
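The snippet below is a minimal sketch for plotting the logged losses from a Colab cell. It assumes the log format used by the CycleGAN codebase (a loss_log.txt file with lines such as "(epoch: 1, iters: 100, ...) D_A: 0.250 G_A: 0.500 ..."); check your file and replace <experiment name> with your experiment's name before running.

import re
from collections import defaultdict
import matplotlib.pyplot as plt

losses = defaultdict(list)  # loss name -> values in logging order
with open('capstone_trained_model/<experiment name>/loss_log.txt') as f:
    for line in f:
        if not line.startswith('('):
            continue  # skip the banner line the logger writes at startup
        _, _, body = line.partition(')')  # drop the "(epoch: ..., iters: ...)" prefix
        for name, value in re.findall(r'(\w+): ([-0-9.]+)', body):
            losses[name].append(float(value))

for name, values in losses.items():
    plt.plot(values, label=name)
plt.xlabel('logging step')
plt.ylabel('loss')
plt.legend()
plt.show()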
@inproceedings{CycleGAN2017,
title={Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks},
author={Zhu, Jun-Yan and Park, Taesung and Isola, Phillip and Efros, Alexei A},
booktitle={Computer Vision (ICCV), 2017 IEEE International Conference on},
year={2017}
}