Optimizers

Finally, the optimizers! In this section, we will define the optimization criteria used during the training process. Because we are going to update the variables of the generator and the discriminator separately, we first need a way to retrieve the variables that belong to each part.

For the first optimizer, the generator one, we will retrieve all the variables whose names start with generator from the trainable variables of the computational graph; we can then check which variable is which by referring to its name.

We'll do the same for the discriminator variables, keeping all variables whose names start with discriminator. After that, we can pass each list of variables to its optimizer through the var_list argument, so that only those variables get updated.

So the variable scope feature of TensorFlow gives us the ability to retrieve variables whose names start with a certain string, leaving us with two different lists of variables, one for the generator and another for the discriminator:


# building the model optimizer

learning_rate = 0.002

# Getting the trainable variables of the computational graph and splitting them
# into generator and discriminator parts based on their name prefixes
trainable_vars = tf.trainable_variables()
gen_vars = [var for var in trainable_vars if var.name.startswith("generator")]
disc_vars = [var for var in trainable_vars if var.name.startswith("discriminator")]

# Each optimizer minimizes its own loss and only touches the variables in its
# var_list, so the two networks are updated independently of each other
disc_train_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(disc_loss, var_list=disc_vars)
gen_train_optimizer = tf.train.AdamOptimizer(learning_rate).minimize(gen_loss, var_list=gen_vars)
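
The startswith() filter works because of how the networks were defined in the first place: every variable created inside a tf.variable_scope gets the scope's name as a prefix. Here is a minimal, self-contained sketch of that naming behavior; the single dense layers are only hypothetical stand-ins for the real generator and discriminator architectures built earlier:

import tensorflow as tf

# Hypothetical stand-ins for the real generator and discriminator networks;
# a single dense layer is enough to show the scope-based naming
with tf.variable_scope("generator"):
    gen_out = tf.layers.dense(tf.placeholder(tf.float32, [None, 100]), units=784)

with tf.variable_scope("discriminator"):
    disc_out = tf.layers.dense(tf.placeholder(tf.float32, [None, 784]), units=1)

# Every variable name carries its scope as a prefix, which is exactly
# what the startswith() filter above relies on
for var in tf.trainable_variables():
    print(var.name)
# generator/dense/kernel:0
# generator/dense/bias:0
# discriminator/dense/kernel:0
# discriminator/dense/bias:0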
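
With the two training ops defined, alternating updates amount to running each op with the right inputs. The following is only a hedged sketch of one training step inside an open session; input_real, input_z, batch_images, and batch_z are assumed names for the model's placeholders and the current mini-batch, not identifiers defined in this section:

# Hypothetical single training step; input_real and input_z are assumed to be
# the model's input placeholders, batch_images and batch_z the current batch
sess.run(disc_train_optimizer, feed_dict={input_real: batch_images, input_z: batch_z})
sess.run(gen_train_optimizer, feed_dict={input_z: batch_z})

Because each op was built with its own var_list, the discriminator step leaves the generator's weights untouched and vice versa, which is what makes the alternating GAN training scheme work.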