Introducing TensorBoard

Keeping track of how variables change during the training of a model can be a tedious job. For instance, in the linear regression example, we kept track of the MSE loss and of the parameters of the model by printing them every 40 epochs. As the complexity of the algorithms increases, so does the number of variables and metrics to be monitored. Fortunately, this is where TensorBoard comes to the rescue.

TensorBoard is a suite of visualization tools that can be used to plot metrics, visualize TensorFlow graphs, and visualize additional information. A typical TensorBoard screen is similar to the one shown in the following screenshot:

Figure 2.6: Scalar TensorBoard page

The integration of TensorBoard with TensorFlow code is pretty straightforward, as it involves only a few tweaks to the code. In particular, to visualize the MSE loss over time and monitor the weight and bias of our linear regression model using TensorBoard, it is first necessary to attach the loss tensor to tf.summary.scalar() and the model's parameters to tf.summary.histogram(). The following snippet should be added after the call to the optimizer:

tf.summary.scalar('MSEloss', loss)
tf.summary.histogram('model_weight', v_weight)
tf.summary.histogram('model_bias', v_bias)

Then, to simplify the process and handle them as a single summary, we can merge them:

all_summary = tf.summary.merge_all()
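To build intuition for what merge_all() does, here is a rough pure-Python analogy (illustrative only; TensorFlow's real implementation differs): each tf.summary.* call registers an op in a global collection, and merge_all() returns a single handle that evaluates every registered op at once.

```python
# Illustrative sketch: a global registry of "summary ops" and a merge step.
_summary_ops = []

def scalar(name, value_fn):
    # Register a deferred computation under a name, like tf.summary.scalar().
    _summary_ops.append(lambda: (name, value_fn()))

def merge_all():
    # Return one callable that runs every registered summary in a single pass.
    def run_all():
        return dict(op() for op in _summary_ops)
    return run_all

# Hypothetical values standing in for the loss and weight tensors.
scalar('MSEloss', lambda: 0.42)
scalar('model_weight', lambda: 1.7)

all_summary = merge_all()
print(all_summary())  # {'MSEloss': 0.42, 'model_weight': 1.7}
```

This is why, later in the training loop, running the single all_summary op is enough to produce all three summaries at once.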

At this point, we have to instantiate a FileWriter instance that will log all the summary information in a file:

now = datetime.now()
clock_time = "{}_{}.{}.{}".format(now.day, now.hour, now.minute, now.second)
file_writer = tf.summary.FileWriter('log_dir/'+clock_time, tf.get_default_graph())

The first two lines create a unique filename using the current date and time. In the third line, the path of the file and the TensorFlow graph are passed to FileWriter(). The second parameter is optional and represents the graph to visualize.
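To see the shape of the log directory name this produces, the formatting can be run on its own with a fixed, hypothetical timestamp (the book's code uses datetime.now() instead):

```python
from datetime import datetime

# Hypothetical fixed timestamp, chosen only to make the output reproducible.
now = datetime(2019, 3, 5, 14, 30, 59)

# Same format string as in the book: day_hour.minute.second
clock_time = "{}_{}.{}.{}".format(now.day, now.hour, now.minute, now.second)
print(clock_time)                # 5_14.30.59
print('log_dir/' + clock_time)   # log_dir/5_14.30.59
```

Each run therefore gets its own subdirectory under log_dir, which lets TensorBoard display multiple runs side by side.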

The final change is done in the training loop by replacing the previous line, train_loss, _ = session.run(..), with the following:

train_loss, _, train_summary = session.run([loss, opt, all_summary], feed_dict={x_ph:X, y_ph:y})
file_writer.add_summary(train_summary, ep)

First, all_summary is executed in the current session, and then the result is added to file_writer to be saved in the file. This procedure will run the three summaries that were merged previously and log them in the log file. TensorBoard will then read from this file and visualize the scalar, the two histograms, and the computation graph.

Remember to close file_writer at the end, as follows:

file_writer.close()

Finally, we can open TensorBoard by going to the working directory and typing the following in a terminal:

$ tensorboard --logdir=log_dir

This command creates a web server that listens on port 6006. To open TensorBoard, you have to go to the link that it prints in the terminal:

Figure 2.7: Histogram of the linear regression model's parameters

You can now browse TensorBoard by clicking on the tabs at the top of the page to access the plots, the histograms, and the graph. In the preceding—as well as the following—screenshots, you can see some of the results visualized on those pages. The plots and the graphs are interactive, so take some time to explore them in order to improve your understanding of their use. Also check the TensorBoard official documentation (https://www.tensorflow.org/guide/summaries_and_tensorboard) to learn more about the additional features included in TensorBoard:

Figure 2.8: Scalar plot of the MSE loss