Code for "Time-constrained Evolutionary Neural Architecture Search for Generative Adversarial Networks" (T-EAGAN).
The training code and network weights are published now; the search code will be published once the paper is accepted.
We've designed an evolutionary neural architecture search algorithm for generative adversarial networks (GANs), dubbed T-EAGAN. Experiments validate the effectiveness of T-EAGAN on the task of unconditional image generation. Extensive experiments on the CIFAR-10 and STL-10 datasets demonstrate that T-EAGAN requires only 1.08 GPU days to find a superior GAN architecture in a search space of approximately 10^15 network architectures. Our best architectures outperform those found by other neural architecture search methods, with IS=8.957±0.08 and FID=9.432 on CIFAR-10, and IS=10.576±0.085 and FID=20.323 on STL-10.
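For readers unfamiliar with the two reported metrics, here is a minimal, hypothetical sketch of how Inception Score (IS) and Fréchet Inception Distance (FID) are defined; this is not the repository's evaluation code (that comes from the official implementations credited below), and the helper names are illustrative.

```python
# Illustrative definitions of IS and FID (hypothetical helpers, not the repo's code).
import numpy as np
from scipy import linalg

def inception_score(probs, eps=1e-12):
    """IS = exp( E_x [ KL( p(y|x) || p(y) ) ] ).
    probs: (N, num_classes) softmax outputs of a classifier on generated images."""
    p_y = probs.mean(axis=0, keepdims=True)                    # marginal class distribution
    kl = probs * (np.log(probs + eps) - np.log(p_y + eps))     # per-sample KL terms
    return float(np.exp(kl.sum(axis=1).mean()))

def fid(mu1, sigma1, mu2, sigma2):
    """FID = ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1 S2)^{1/2}),
    where (mu, S) are the mean/covariance of Inception features."""
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):          # sqrtm can return tiny imaginary parts
        covmean = covmean.real
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

Higher IS and lower FID are better, which is why the results above report both.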
Fig: Framework of T-EAGAN
The search environment is consistent with AlphaGAN. To run this code, you need:
- PyTorch 1.3.0
- TensorFlow 1.15.0
- CUDA 10.0
Other requirements are listed in environment.yaml:
conda env create -f environment.yaml
You need to create a "fid_stat" directory and download the statistical files of the real images:
mkdir fid_stat
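As a hypothetical illustration of what such a statistics file contains (the mean and covariance of Inception features of the real dataset), the sketch below writes a dummy `.npz` in that format; the file name and feature values here are placeholders, so use the official precomputed files for actual evaluation.

```python
# Hypothetical sketch of the fid_stat file format; values and file name are placeholders.
import os
import numpy as np

os.makedirs("fid_stat", exist_ok=True)
feats = np.random.randn(500, 2048)         # stand-in for Inception activations of real images
mu = feats.mean(axis=0)                    # feature mean, shape (2048,)
sigma = np.cov(feats, rowvar=False)        # feature covariance, shape (2048, 2048)
np.savez("fid_stat/fid_stats_example.npz", mu=mu, sigma=sigma)
```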
To run the architecture search:
bash EAGAN_Only_G30.sh
To train the discovered architectures:
bash ./scripts/train_arch_cifar10.sh
bash ./scripts/train_arch_stl10.sh
To test the trained models:
bash ./scripts/test_arch_cifar10.sh
bash ./scripts/test_arch_stl10.sh
Some of the code is built upon:
1.EAGAN
2.AlphaGAN
3.Inception Score code from OpenAI's Improved GAN (official).
4.FID Score code and CIFAR-10 statistics file from (official).
Thanks to them for their great work!