PyTorch implementation of Vanilla GAN
Results for the 1-D Gaussian approximation example (result plots omitted):
- For mu = 0.0, sigma = 1.0
- For mu = 1.0, sigma = 1.5
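For the 1-D Gaussian cases above, the "real" data the discriminator sees are samples drawn from N(mu, sigma). A minimal sketch (the function name `sample_real` and the batch shape are illustrative, not from the repo):

```python
import torch

def sample_real(batch_size, mu=0.0, sigma=1.0):
    # Real data for the 1-D Gaussian example: one scalar per sample,
    # drawn from a normal distribution with the given mu and sigma.
    return torch.normal(mu, sigma, size=(batch_size, 1))

# The two cases listed above:
x0 = sample_real(128, mu=0.0, sigma=1.0)
x1 = sample_real(128, mu=1.0, sigma=1.5)
```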
References
- http://blog.aylien.com/introduction-generative-adversarial-networks-code-tensorflow/
- http://blog.evjang.com/2016/06/generative-adversarial-nets-in.html
- https://github.com/hwalsuklee/tensorflow-GAN-1d-gaussian-ex
Generator
- hidden layers: Three fully-connected (256, 512, and 1024 nodes, respectively), Leaky ReLU activation
- output layer: Fully-connected (784 nodes), Tanh activation
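The generator spec above can be sketched as a PyTorch module. The 784-node Tanh output corresponds to a flattened 28x28 image; the noise dimension (100) and the LeakyReLU negative slope (0.2) are assumptions the README does not state:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    # Three fully-connected hidden layers (256, 512, 1024) with LeakyReLU,
    # then a 784-node output layer with Tanh, as listed above.
    # noise_dim=100 and the 0.2 slope are assumed, not stated in the README.
    def __init__(self, noise_dim=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, 784), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)
```

Tanh keeps outputs in [-1, 1], so real images should be normalized to the same range before training.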
Discriminator
- hidden layers: Three fully-connected (1024, 512, and 256 nodes, respectively), Leaky ReLU activation
- output layer: Fully-connected (1 node), Sigmoid activation
- Dropout: dropout probability = 0.3
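A matching sketch of the discriminator spec above, with dropout (p = 0.3) applied after each hidden layer; as with the generator, the LeakyReLU slope of 0.2 is an assumption:

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    # Three fully-connected hidden layers (1024, 512, 256) with LeakyReLU
    # and dropout p=0.3, then a single Sigmoid output node, as listed above.
    # The 0.2 LeakyReLU slope is assumed, not stated in the README.
    def __init__(self, input_dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, 1024), nn.LeakyReLU(0.2), nn.Dropout(0.3),
            nn.Linear(1024, 512), nn.LeakyReLU(0.2), nn.Dropout(0.3),
            nn.Linear(512, 256), nn.LeakyReLU(0.2), nn.Dropout(0.3),
            nn.Linear(256, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)
```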
- For learning rate = 0.0002 (Adam optimizer), batch size = 128, # of epochs = 100:
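One step of the training setup above (Adam, lr = 0.0002, batch size 128, BCE loss) can be sketched as follows. The tiny one-layer stand-in networks and the random `real` batch are placeholders so the sketch is self-contained; only the alternating D/G update logic is the point:

```python
import torch
import torch.nn as nn

# Placeholder one-layer nets standing in for the Generator/Discriminator
# described above, shrunk so the sketch stays short; not the repo's models.
G = nn.Sequential(nn.Linear(100, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 1), nn.Sigmoid())

criterion = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=0.0002)
opt_D = torch.optim.Adam(D.parameters(), lr=0.0002)

batch_size = 128
real = torch.rand(batch_size, 784) * 2 - 1  # stand-in for real data in [-1, 1]
ones = torch.ones(batch_size, 1)
zeros = torch.zeros(batch_size, 1)

# Discriminator step: push D(real) toward 1 and D(fake) toward 0.
z = torch.randn(batch_size, 100)
d_loss = criterion(D(real), ones) + criterion(D(G(z).detach()), zeros)
opt_D.zero_grad()
d_loss.backward()
opt_D.step()

# Generator step: push D(G(z)) toward 1, i.e. fool the discriminator.
z = torch.randn(batch_size, 100)
g_loss = criterion(D(G(z)), ones)
opt_G.zero_grad()
g_loss.backward()
opt_G.step()
```

Note the `.detach()` in the discriminator step: it blocks gradients from flowing into G while D is being updated.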
Results: GAN loss curves and generated images (result figures omitted).