The generator wants to fool the discriminator; in other words, it wants the discriminator to output 1 for a generated image G(z). The generator loss is just the negative of the binary cross-entropy loss applied to the discriminator's output on a generated image. Since the generator is always trying to have its images classified as "real", the cross-entropy loss simplifies down to this:

$$L_G = \frac{1}{m}\sum_{i=1}^{m} \log D(G(z^{(i)}))$$
Here, each term means as follows:
- m: Batch size
- D: Discriminator
- G: Generator
- z: Random noise vector
We want to maximize this loss function when training the generator. The loss is maximized (it approaches 0, since log D is at most 0) when the discriminator outputs values near 1 for generated images, meaning the generator is producing images that fool the discriminator.
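As a minimal sketch of this objective, the snippet below computes the mean of log D(G(z)) over a batch of discriminator outputs using NumPy. The function name `generator_loss` is illustrative, and the discriminator outputs are assumed to already be probabilities in (0, 1):

```python
import numpy as np

def generator_loss(d_outputs):
    """Generator objective: (1/m) * sum of log D(G(z)) over a batch.

    d_outputs: discriminator outputs in (0, 1) for a batch of m
    generated images, i.e. D(G(z)) for each noise vector z.
    Returns a value <= 0; it approaches 0 as the outputs approach 1.
    """
    d_outputs = np.asarray(d_outputs, dtype=float)
    return np.mean(np.log(d_outputs))
```

In practice, deep learning frameworks minimize rather than maximize, so training code typically minimizes the negative of this quantity instead.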