Table of Contents

Copyright

Brief Table of Contents

Table of Contents

Preface

Acknowledgments

About this book

About the cover illustration

1. Introduction to GANs and generative modeling

Chapter 1. Introduction to GANs

1.1. What are Generative Adversarial Networks?

1.2. How do GANs work?

1.3. GANs in action

1.3.1. GAN training

1.3.2. Reaching equilibrium

1.4. Why study GANs?

Summary

Chapter 2. Intro to generative modeling with autoencoders

2.1. Introduction to generative modeling

2.2. How do autoencoders function on a high level?

2.3. What are autoencoders to GANs?

2.4. What is an autoencoder made of?

2.5. Usage of autoencoders

2.6. Unsupervised learning

2.6.1. New take on an old idea

2.6.2. Generation using an autoencoder

2.6.3. Variational autoencoder

2.7. Code is life

2.8. Why did we try aGAN?

Summary

Chapter 3. Your first GAN: Generating handwritten digits

3.1. Foundations of GANs: Adversarial training

3.1.1. Cost functions

3.1.2. Training process

3.2. The Generator and the Discriminator

3.2.1. Conflicting objectives

3.2.2. Confusion matrix

3.3. GAN training algorithm

3.4. Tutorial: Generating handwritten digits

3.4.1. Importing modules and specifying model input dimensions

3.4.2. Implementing the Generator

3.4.3. Implementing the Discriminator

3.4.4. Building the model

3.4.5. Training

3.4.6. Outputting sample images

3.4.7. Running the model

3.4.8. Inspecting the results

3.5. Conclusion

Summary

Chapter 4. Deep Convolutional GAN

4.1. Convolutional neural networks

4.1.1. Convolutional filters

4.1.2. Parameter sharing

4.1.3. ConvNets visualized

4.2. Brief history of the DCGAN

4.3. Batch normalization

4.3.1. Understanding normalization

4.3.2. Computing batch normalization

4.4. Tutorial: Generating handwritten digits with DCGAN

4.4.1. Importing modules and specifying model input dimensions

4.4.2. Implementing the Generator

4.4.3. Implementing the Discriminator

4.4.4. Building and running the DCGAN

4.4.5. Model output

4.5. Conclusion

Summary

2. Advanced topics in GANs

Chapter 5. Training and common challenges: GANing for success

5.1. Evaluation

5.1.1. Evaluation framework

5.1.2. Inception score

5.1.3. Fréchet inception distance

5.2. Training challenges

5.2.1. Adding network depth

5.2.2. Game setups

5.2.3. Min-Max GAN

5.2.4. Non-Saturating GAN

5.2.5. When to stop training

5.2.6. Wasserstein GAN

5.3. Summary of game setups

5.4. Training hacks

5.4.1. Normalizations of inputs

5.4.2. Batch normalization

5.4.3. Gradient penalties

5.4.4. Train the Discriminator more

5.4.5. Avoid sparse gradients

5.4.6. Soft and noisy labels

Summary

Chapter 6. Progressing with GANs

6.1. Latent space interpolation

6.2. They grow up so fast

6.2.1. Progressive growing and smoothing of higher-resolution layers

6.2.2. Example implementation

6.2.3. Mini-batch standard deviation

6.2.4. Equalized learning rate

6.2.5. Pixel-wise feature normalization in the generator

6.3. Summary of key innovations

6.4. TensorFlow Hub and hands-on

6.5. Practical applications

Summary

Chapter 7. Semi-Supervised GAN

7.1. Introducing the Semi-Supervised GAN

7.1.1. What is a Semi-Supervised GAN?

7.1.2. Architecture

7.1.3. Training process

7.1.4. Training objective

7.2. Tutorial: Implementing a Semi-Supervised GAN

7.2.1. Architecture diagram

7.2.2. Implementation

7.2.3. Setup

7.2.4. The dataset

7.2.5. The Generator

7.2.6. The Discriminator

7.2.7. Building the model

7.2.8. Training

7.3. Comparison to a fully supervised classifier

7.4. Conclusion

Summary

Chapter 8. Conditional GAN

8.1. Motivation

8.2. What is Conditional GAN?

8.2.1. CGAN Generator

8.2.2. CGAN Discriminator

8.2.3. Summary table

8.2.4. Architecture diagram

8.3. Tutorial: Implementing a Conditional GAN

8.3.1. Implementation

8.3.2. Setup

8.3.3. CGAN Generator

8.3.4. CGAN Discriminator

8.3.5. Building the model

8.3.6. Training

8.3.7. Outputting sample images

8.3.8. Training the model

8.3.9. Inspecting the output: Targeted data generation

8.4. Conclusion

Summary

Chapter 9. CycleGAN

9.1. Image-to-image translation

9.2. Cycle-consistency loss: There and back aGAN

9.3. Adversarial loss

9.4. Identity loss

9.5. Architecture

9.5.1. CycleGAN architecture: building the network

9.5.2. Generator architecture

9.5.3. Discriminator architecture

9.6. Object-oriented design of GANs

9.7. Tutorial: CycleGAN

9.7.1. Building the network

9.7.2. Building the Generator

9.7.3. Building the Discriminator

9.7.4. Training the CycleGAN

9.7.5. Running CycleGAN

9.8. Expansions, augmentations, and applications

9.8.1. Augmented CycleGAN

9.8.2. Applications

Summary

3. Where to go from here

Chapter 10. Adversarial examples

10.1. Context of adversarial examples

10.2. Lies, damned lies, and distributions

10.3. Use and abuse of training

10.4. Signal and the noise

10.5. Not all hope is lost

10.6. Adversaries to GANs

10.7. Conclusion

Summary

Chapter 11. Practical applications of GANs

11.1. GANs in medicine

11.1.1. Using GANs to improve diagnostic accuracy

11.1.2. Methodology

11.1.3. Results

11.2. GANs in fashion

11.2.1. Using GANs to design fashion

11.2.2. Methodology

11.2.3. Creating new items matching individual preferences

11.2.4. Adjusting existing items to better match individual preferences

11.3. Conclusion

Summary

Chapter 12. Looking ahead

12.1. Ethics

12.2. GAN innovations

12.2.1. Relativistic GAN

12.2.2. Self-Attention GAN

12.2.3. BigGAN

12.3. Further reading

12.4. Looking back and closing thoughts

Summary

Index

List of Figures

List of Tables

List of Listings
