A PyTorch implementation of CycleGAN, applied to font style transfer.

I used the Anaconda distribution to install all required packages and create my virtual environment; you can use pip instead if you prefer. Python 3.7 is required to run the code.
To mimic my setup steps, do the following:

- Install Python 3.7
- Install Anaconda by following the instructions here
- Create a conda virtual environment: `conda create --name {your_env_name}`
- Activate the environment: `conda activate {your_env_name}`
- Install the following packages:

```
conda install numpy
conda install pandas
conda install scikit-learn
conda install matplotlib
conda install seaborn
conda install pytorch torchvision -c pytorch
conda install jupyter
```
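To sanity-check the environment before running anything, a small script like the following (my own addition, not part of the repository) can verify that the packages are importable:

```python
import importlib.util

def check_packages(names):
    """Map each package name to whether it can be imported in this environment."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Note: scikit-learn is imported as "sklearn".
for name, ok in check_packages(
    ["numpy", "pandas", "sklearn", "matplotlib", "seaborn", "torch", "torchvision"]
).items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```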
- Clone this repository and `cd` into its root directory
- To download the datasets used in the original paper, execute `download.sh` with a parameter that is one of `apple2orange`, `summer2winter_yosemite`, `horse2zebra`, `monet2photo`, `cezanne2photo`, `ukiyoe2photo`, `vangogh2photo`, `maps`, `cityscapes`, `facades`, `iphone2dslr_flower`, `ae_photos`. Note that for my experimentation I have only used `apple2orange` and `summer2winter_yosemite`.
- The data for font style transfer that I used is already included in the repository under `datasets/arial2times` and `datasets/arial2times_word`.
- There are two Jupyter notebooks, `cycle-gan.ipynb` and `image_generator.ipynb`, that were used for local experimentation and font image generation, but they are not needed to operate this repository. The main operational files are `train.py`, `test.py`, `utils.py` and `model.py`.
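For reference, the font datasets were presumably produced by rendering glyphs to fixed-size bitmaps. A minimal sketch of that idea using Pillow (the helper name and parameters are my own illustration, not taken from `image_generator.ipynb`):

```python
from PIL import Image, ImageDraw, ImageFont

def render_char(ch, size=64, font=None):
    """Render one character as a grayscale image: white background, black glyph.
    The actual notebook may use different sizes, fonts and offsets."""
    img = Image.new("L", (size, size), color=255)
    draw = ImageDraw.Draw(img)
    draw.text((size // 4, size // 4), ch, fill=0, font=font or ImageFont.load_default())
    return img
```

Pairing images rendered with an Arial font against ones rendered with a Times font would yield the two unpaired domains that CycleGAN expects.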
Once you have set up the virtual environment, cloned the repository and downloaded the data, you can run training:
```
# apple2orange
python train.py --dataset apple2orange --epochs 50 --constant_lr_epochs 25 --lr 0.0004 --cycle_loss_lambda 5

# summer2winter_yosemite
python train.py --dataset summer2winter_yosemite --epochs 50 --constant_lr_epochs 25 --lr 0.0002 --cycle_loss_lambda 10 --identity_loss_lambda 5

# arial2times_word
python train.py --dataset arial2times_word --epochs 50 --constant_lr_epochs 25 --lr 0.0002 --cycle_loss_lambda 10 --identity_loss_lambda 5
```
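The `--constant_lr_epochs` flag suggests the usual CycleGAN schedule: a constant learning rate for the first phase of training, then a linear decay toward zero. A sketch of that schedule (my own illustration, not the code in `train.py`):

```python
def lr_at_epoch(epoch, base_lr=0.0002, epochs=50, constant_lr_epochs=25):
    """Constant learning rate for the first constant_lr_epochs epochs,
    then linear decay to 0 over the remaining epochs."""
    if epoch < constant_lr_epochs:
        return base_lr
    decay_span = epochs - constant_lr_epochs
    return base_lr * (1.0 - (epoch - constant_lr_epochs) / decay_span)
```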
Please see `train.py` for the full list of parameters.
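As context for the lambda flags: in CycleGAN the generator objective is a weighted sum of the adversarial, cycle-consistency and identity terms, with `--cycle_loss_lambda` and `--identity_loss_lambda` as the weights. A rough sketch of that combination (the flag-to-weight mapping is assumed from the commands above, not verified against `train.py`):

```python
def generator_loss(adv_loss, cycle_loss, identity_loss,
                   cycle_loss_lambda=10.0, identity_loss_lambda=5.0):
    """Weighted sum of CycleGAN generator loss terms.
    The inputs are scalars (floats or torch tensors); the lambdas mirror
    the --cycle_loss_lambda and --identity_loss_lambda flags."""
    return adv_loss + cycle_loss_lambda * cycle_loss + identity_loss_lambda * identity_loss
```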
Once training has finished, run testing with:

```
python test.py --dataset apple2orange
python test.py --dataset summer2winter_yosemite
python test.py --dataset arial2times_word
```
Sample results are shown below:
Arial to Times to Arial (word)
Times to Arial to Times (word)
Loss trends over epochs (for Arial to Times to Arial (word)):