Training an agent

For much of this book, we have spent our time looking at code and the inner depths of deep learning (DL) and reinforcement learning (RL). With that knowledge established, we can now jump in and look at examples that put deep reinforcement learning (DRL) to use. Fortunately, the ML-Agents toolkit provides several examples that demonstrate the power of the engine. Open Unity or the Unity Hub and follow these steps:

  1. Click the Open project button at the top of the Project dialog.
  2. Locate and open the UnitySDK project folder as shown in the following screenshot:
Opening the UnitySDK project
  3. Wait for the project to load and then open the Project window at the bottom of the editor. If you are asked to update the project, just be sure to say yes or continue. Thus far, all of the agent code has been designed to be backward compatible.

  4. Locate and open the GridWorld scene as shown in this screenshot:
Opening the GridWorld example scene
  5. Select the GridAcademy object in the Hierarchy window.
  6. Then direct your attention to the Inspector window, and beside the Brains list, click the target icon to open the Brain selection dialog:
Inspecting the GridWorld example environment
  7. Select the GridWorldPlayer brain. This brain is a player brain, meaning that a player (you) can control the game. We will look at the brain concept in more detail in the next section.
  8. Press the Play button at the top of the editor and watch the grid environment form. Since the game is currently set to use a player brain, you can use the WASD keys to move the cube. The goal is much like that of the FrozenPond environment we built a DQN for earlier: you have to move the blue cube to the green + symbol while avoiding the red X. (A simplified sketch of how the agent consumes these movement actions follows this list.)
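
Whether the attached brain is a player brain driven by the keyboard or a learning brain driven by a trained model, the agent receives the same discrete actions. The following C# sketch shows roughly how a GridWorld-style agent might consume those actions; the class name, movement logic, and reward values are illustrative assumptions rather than the shipped GridAgent script, and the AgentAction signature assumes the version of the ML-Agents toolkit bundled in the UnitySDK project:

```csharp
using UnityEngine;
using MLAgents;

// Hypothetical, simplified agent for a GridWorld-style environment.
// The class name, movement logic, and reward values are illustrative
// assumptions, not the shipped GridAgent script.
public class SimpleGridAgent : Agent
{
    const float StepPenalty = -0.01f;   // assumed small time penalty per move

    // Called each decision step with the action chosen by the attached brain,
    // whether that brain is a player brain (keyboard) or a learning brain.
    public override void AgentAction(float[] vectorAction, string textAction)
    {
        // The brain outputs one discrete action: 0 = stay, 1-4 = move.
        int action = Mathf.FloorToInt(vectorAction[0]);
        Vector3 move = Vector3.zero;
        switch (action)
        {
            case 1: move = Vector3.forward; break;
            case 2: move = Vector3.back;    break;
            case 3: move = Vector3.left;    break;
            case 4: move = Vector3.right;   break;
        }
        transform.position += move;

        // Illustrative rewards: small step penalty, +1 for reaching the
        // green goal, -1 for stepping on the red X, then end the episode.
        AddReward(StepPenalty);
        if (ReachedGoal())
        {
            SetReward(1f);
            Done();
        }
        else if (HitPit())
        {
            SetReward(-1f);
            Done();
        }
    }

    // Placeholder checks; the real example detects these with colliders.
    bool ReachedGoal() { return false; }
    bool HitPit() { return false; }
}
```

Note that when you later switch the academy over to a learning brain, this action-handling code would not need to change; only the source of the vectorAction values does.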

Feel free to play the game as much as you like. Note how the game only runs for a certain amount of time and is not turn-based. In the next section, we will learn how to run this example with a DRL agent.
