Implementation of FastGAN from the paper "Towards Faster and Stabilized GAN Training for High-Fidelity Few-Shot Image Synthesis" (https://arxiv.org/abs/2101.04775).

The few-shot-images dataset used by the default configuration is downloaded automatically from http://silentz.ml/few-shot-images.zip. If the domain has expired, download the archive manually and unpack it into the `data/` directory inside the project root. NOTE: `few-shot-images.zip` can also be downloaded from the GitHub releases of this repository. The final `data/` directory layout should look like this:
```
data/
├── few-shot-images
│   ├── anime
│   ├── art
│   ├── cat_faces
│   ├── dog_faces
│   ├── grumpy_cat
│   ├── moongate
│   ├── obama
│   ├── panda
│   ├── pokemon
│   ├── shells
│   └── skulls
└── few-shot-images.zip
```
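If the automatic download fails, the archive can also be fetched and unpacked with a short standard-library script. This is a minimal sketch (the `download_dataset` helper name is hypothetical, not part of the repository):

```python
import urllib.request
import zipfile
from pathlib import Path

def download_dataset(url: str, data_dir: str = "data") -> None:
    """Fetch the few-shot-images archive and unpack it under data_dir."""
    dest = Path(data_dir)
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / "few-shot-images.zip"
    # Download the zip archive to data/few-shot-images.zip.
    with urllib.request.urlopen(url) as resp:
        archive.write_bytes(resp.read())
    # Unpack next to the archive, producing data/few-shot-images/.
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)

# Example (the URL may be offline; the GitHub release asset works as well):
# download_dataset("http://silentz.ml/few-shot-images.zip")
```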
Loss values tracked during training:

- `gen_loss` - generator loss
- `disc_real_loss` - discriminator loss on real images
- `disc_fake_loss` - discriminator loss on images from the generator
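FastGAN trains with a hinge adversarial loss, so these three quantities can be sketched as follows (a simplified illustration of the loss shapes, not the exact code of this repository):

```python
import torch

def disc_real_loss(real_logits: torch.Tensor) -> torch.Tensor:
    # Hinge loss on real images: push discriminator logits above +1.
    return torch.relu(1.0 - real_logits).mean()

def disc_fake_loss(fake_logits: torch.Tensor) -> torch.Tensor:
    # Hinge loss on generated images: push logits below -1.
    return torch.relu(1.0 + fake_logits).mean()

def gen_loss(fake_logits: torch.Tensor) -> torch.Tensor:
    # Generator maximizes discriminator logits on its own samples.
    return -fake_logits.mean()
```

Logits far on the correct side of the margin contribute zero to the discriminator losses, which keeps the discriminator from over-penalizing easy samples.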
- Clone the repository:

  ```shell
  git clone https://github.com/silentz/Towards-Faster-And-Stabilized-GAN-Training-For-High-Fidelity-Few-Shot-Image-Synthesis.git
  ```

- Change into the repository root:

  ```shell
  cd Towards-Faster-And-Stabilized-GAN-Training-For-High-Fidelity-Few-Shot-Image-Synthesis
  ```

- Create and activate a virtualenv:

  ```shell
  virtualenv --python=python3 venv
  source venv/bin/activate
  ```

- Install the required packages:

  ```shell
  pip install -r requirements.txt
  ```
- Download the dataset (this should happen automatically; if not, see the section above).

- Choose one of the dataset configs (located in the `train/configs` directory):
```
train/configs/
├── anime.yaml
├── art.yaml
├── cat_faces.yaml
├── dog_faces.yaml
├── grumpy_cat.yaml
├── moongate.yaml
├── obama.yaml
├── panda.yaml
├── pokemon.yaml
├── shells.yaml
└── skulls.yaml
```
- Train the model:

  ```shell
  python -m train fit --config train/configs/shells.yaml
  ```

- Create a TorchScript model:

  ```shell
  python -m train.export --config train/configs/shells.yaml --from_ckpt checkpoints/epoch=0-step=49999.ckpt
  ```

- Run the inference script:

  ```shell
  python infer.py export/shells.pt  # path to exported model (see config)
  ```
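The exported TorchScript file can be used from plain PyTorch without any of the training code. The following is a self-contained sketch of the export/load round trip using a tiny stand-in generator (the real model's latent size and architecture differ):

```python
import torch

class TinyGenerator(torch.nn.Module):
    # Stand-in generator: maps a latent vector to a small fake "image".
    def __init__(self) -> None:
        super().__init__()
        self.fc = torch.nn.Linear(16, 3 * 8 * 8)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.fc(z).view(-1, 3, 8, 8)

# Export with torch.jit.script, analogous to what train.export produces.
scripted = torch.jit.script(TinyGenerator())
scripted.save("generator.pt")

# Inference mirrors infer.py: load the TorchScript file and sample images.
model = torch.jit.load("generator.pt")
model.eval()
with torch.no_grad():
    images = model(torch.randn(2, 16))
```

`torch.jit.load` needs only the `.pt` file, which is why the inference script takes a single path argument.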
A trained FastGAN TorchScript model is attached to the second release (see the "Releases" section of the repository's GitHub page). Here are samples of generated images (the model was trained on the shells dataset):