WGAN project for the Machine Learning and Data Mining course
- Generative Adversarial Network (GAN).
- A model that learns the distribution of a dataset and generates new data in the same fashion.
- A GAN consists of two neural networks: a discriminator and a generator.
- The discriminator learns to distinguish the generator's fake data from real data.
- The discriminator penalizes the generator for producing implausible results.
- The generator learns to generate plausible data.
- The generated instances become negative training examples for the discriminator.
- Wasserstein distance, or Earth Mover's distance (EMD): the minimum cost of transforming one probability distribution into another, pictured as moving piles of dirt.
- The cost is quantified by the amount of dirt moved times the distance it travels.
- The Wasserstein metric provides a smooth measure, which is very helpful for a stable learning process using gradient descent.
- In WGAN, instead of computing the EMD directly (the primal problem), we solve its dual (Kantorovich-Rubinstein duality), because the primal is too expensive to compute.
- In WGAN, the critic's score is used directly as the loss for computing gradients.
- In WGAN the model is more stable, which helps prevent mode collapse (where the generator produces data from only a few classes instead of all of them).
- Vanishing gradients are less common.
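As a small illustration of the Earth Mover's distance described above: for 1-D samples, the optimal transport plan simply matches sorted values, so the distance reduces to the mean gap between sorted samples. This is a minimal sketch (the function name is our own, not part of the course implementation):

```python
import numpy as np

def wasserstein_1d(a, b):
    """Wasserstein-1 (EMD) between two equal-sized 1-D samples.

    In 1-D, optimal transport matches sorted values, so the
    distance is the mean absolute gap between sorted samples.
    """
    a, b = np.sort(a), np.sort(b)
    return np.mean(np.abs(a - b))

base = np.linspace(-1.0, 1.0, 100)
# Shifting a distribution moves the metric smoothly with the shift,
# which is exactly what makes it a useful training signal.
print(wasserstein_1d(base, base + 0.5))  # → 0.5
```

Note how the distance changes smoothly and linearly with the shift; this smoothness is the "super helpful" property that keeps gradients informative even when the two distributions barely overlap.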
The implementation trains on the MNIST dataset; in the near future we plan to apply WGAN to a Pokémon image dataset.
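The core of a WGAN training step can be sketched with NumPy and a toy linear critic. This is an illustrative assumption, not the actual MNIST implementation: the real critic is a neural network, and the names here are our own. It shows the Wasserstein critic loss and the weight clipping used in the original WGAN paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def critic(x, w, b):
    # toy linear "critic" f(x) = w*x + b, standing in for a real network
    return w * x + b

real = rng.normal(2.0, 0.5, size=256)  # samples from the real distribution
fake = rng.normal(0.0, 0.5, size=256)  # samples from the generator

w, b = 1.0, 0.0

# Critic loss: maximize E[f(real)] - E[f(fake)],
# i.e. minimize the negated difference.
critic_loss = -(critic(real, w, b).mean() - critic(fake, w, b).mean())

# Generator loss: push the critic to score fakes highly.
gen_loss = -critic(fake, w, b).mean()

# The original WGAN enforces an approximate 1-Lipschitz constraint by
# clipping the critic's weights into a small box after each update.
c = 0.01
w = float(np.clip(w, -c, c))
```

In a real implementation the critic is typically updated several times per generator update, and later WGAN variants replace weight clipping with a gradient penalty.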
WGAN article
EMD duality
WGAN implementation
GAN
More about WGAN
This project is about understanding and researching WGANs.
The implementation is based on Jason Brownlee's WGAN implementation
The featured implementation will change as time passes :)