Chapter 22: Augmented Reality in Unity

Nowadays, new technologies are expanding Unity's fields of application beyond gaming into all kinds of software, such as simulations, training tools, and productivity apps. In the latest versions of Unity, we have seen lots of improvements in the field of augmented reality (AR), which allows us to add a layer of virtuality on top of our reality, augmenting what our device can perceive to create games that rely on real-world data, such as the camera's image, our real-world position, and the current weather. This can also be applied to work environments, such as viewing a building's map or checking the electrical ducts inside a wall. Welcome to the extra section of this book, where we are going to discuss how to create AR applications using Unity's AR Foundation package.

In this chapter, we will examine the following AR Foundation concepts:

  • Using AR Foundation
  • Building for mobile devices
  • Creating a simple AR game

By the end of this chapter, you will be able to create AR apps using AR Foundation, and will have a fully functional game built with it so that you can test the framework's capabilities.

Let's start by exploring the AR Foundation framework.

Using AR Foundation

When it comes to AR, Unity has two main tools for creating applications: Vuforia and AR Foundation. Vuforia is an AR framework that works on almost any phone and contains all the features needed for basic AR apps, though its more advanced features require a paid subscription. The completely free AR Foundation framework, on the other hand, supports the latest native AR features of our devices but runs only on newer devices. Choosing between the two depends a lot on the type of project you're going to build and its target audience. However, since this book aims to discuss the latest Unity features, we are going to explore how to use AR Foundation to create our first AR app, which will detect the positions of images and surfaces in the real world. So, we'll start by exploring its API.

In this section, we will examine the following AR Foundation concepts:

  • Creating an AR Foundation project
  • Using tracking features

Let's start by discussing how to prepare our project so that it can run AR Foundation apps.

Creating an AR Foundation project

Something to consider when creating AR projects is that we will change not only the way we code our game, but also its game design. AR apps differ in how the user interacts with them and have their own limitations, such as the user being in control of the camera at all times. We cannot simply port an existing game to AR without changing the very core experience of the game. That's why, in this chapter, we are going to work on a brand new project; it would be too difficult to change the game we've created so far so that it works well in AR.

In our case, we are going to create a game where the user controls a player by moving a "marker", a physical image you can print that allows our app to recognize where the player is in the real world. We will be able to move the player by moving that image, and this virtual player will automatically shoot at the nearest Enemy. Those enemies will spawn from spawn points that the user places in different parts of their home. As an example, we can put two spawn points on the walls and place our player marker on a table in the middle of the room so that the enemies will move toward it. In the following image, you can see a preview of what the game will look like:

Figure 22.1 – Finished game. The Cylinder is an Enemy Spawner, the Capsule is the Enemy, and the Cube is the Player. These are positioned in a marker image displayed by the cellphone

We'll start by creating a new URP-based project in the same manner we created our first game. Something to consider is that AR Foundation also works with other pipelines, including the built-in one, in case you want to use it in existing projects. If you don't remember how to create a project, please refer to Chapter 2, Setting Up Unity. Once you're in your new blank project, install the AR Foundation package from the Package Manager, just like we've installed other packages previously; that is, from Window | Package Manager. Remember to set the Package Manager so that it shows all packages, not only the ones in the project (the Packages button at the top-left part of the window needs to be set to Unity Registry). At the time of writing this book, the latest version is 4.0.2. Remember that you can click the triangle to the left of the package in the list and use the See other versions button to display other version options. If you find a newer version than mine, you can try using that one, but as usual, if something works differently from what we want, please install this specific version:

Figure 22.2 – Installing AR Foundation

Before we install any other needed packages, now is a good moment to discuss some core ideas of the AR Foundation framework. This package, by itself, does nothing; it defines a series of AR features that mobile devices offer, such as image tracking, point clouds, and object tracking, but the actual implementation of those features is contained in the Provider packages, such as the AR Kit and AR Core XR plugins. It is designed this way because the implementation of those features changes depending on the target device. As an example, on iOS, Unity implements those features using AR Kit, while on Android, it uses AR Core; they are platform-specific frameworks.

Something to consider here is that not all iOS or Android devices support AR Foundation apps. You can find updated lists of supported devices by searching for AR Core and AR Kit supported devices on the internet.

Also, there isn't a PC Provider package, so for now the only way to test AR Foundation apps is directly on the device, although testing tools are expected to be released soon. In my case, I will be creating an app for iOS, so aside from the AR Foundation package, I need to install the ARKit XR plugin. However, if you want to develop for Android, install the ARCore XR plugin instead (or both if you're targeting both platforms). I will be using the 4.0.2 version of the ARKit package, but at the time of writing this book, the recommended ARCore version is 4.0.4. Usually, the versions of the AR Foundation and Provider packages match, but apply the same logic as when you picked the AR Foundation version. In the following screenshot, you can see the ARKit package in the Package Manager:

Figure 22.3 – Installing the platform-specific AR provider package

Now that we have the needed plugins, we need to prepare a scene for AR, as follows:

  1. Create a new Scene in File | New Scene.
  2. Delete Main Camera; we are going to use a different one.
  3. In the GameObject | XR menu, create an AR Session Object.
  4. In the same menu, create an AR Session Origin Object that has a Camera inside it:
    Figure 22.4 – Creating the Session objects

  5. Your hierarchy should look as follows:
Figure 22.5 – Starter AR scene

The AR Session object will be responsible for initializing the AR framework and will handle all the update logic of the AR systems. The AR Session Origin object allows the framework to locate tracked objects, such as images and point clouds, at positions relative to the scene. The device reports the positions of tracked objects relative to what it considers "the origin", which is usually the first area of your house you were pointing at when the app started detecting objects, so the AR Session Origin object represents that area. Finally, check the camera inside the origin; it contains some extra components, the most important being AR Pose Driver, which makes your Camera object move along with your device. Since the device's position is relative to the Session Origin point, the camera needs to be inside the origin object.

One extra step in case you are working in a URP project (our case) is that you need to set up the render pipeline so that it supports rendering the camera image in the app. To do that, go to the Settings folder that was generated when we created the project, look for the Forward Renderer file, and select it. In the Renderer Features list, click the Add renderer feature button and select AR Background Renderer Feature. Consider that this option might be unavailable if you are working on versions older than 4.0.0 of the AR Foundation and Provider plugins. In the following screenshot, you can see what the Forward Renderer asset should look like:

Figure 22.6 – Adding support for URP

And that's all! We are ready to start exploring the AR Foundation components so that we can implement tracking features.

Using tracking features

For our project, we are going to need two of the most common tracking features in AR (though not the only ones): image recognition and plane detection. The first consists of detecting the real-world position of a specific image so that we can place digital objects on top of it, such as the player. The second, plane detection, consists of recognizing real-life surfaces, such as floors, tables, and walls, so that we have a reference for where we can put objects such as the enemies' spawn points. Only horizontal and vertical surfaces are recognized (and some devices detect horizontal surfaces only).

The first thing we need to do is tell our app which images it needs to detect, as follows:

  1. Add an image to the project that you can print or display on a cellphone. Having a way to display the image in the real world is necessary to test this. In this case, I will use the following image:
    Figure 22.7 – Image to track

    Important Note

    Try to get an image that contains as many features as you can. This means an image with lots of little details, such as contrasts, sharp corners, and so on. Those are what our AR systems use to detect it; the more detail, the better the recognition. In our case, the Unity logo we are using doesn't actually have too many details, but there's enough contrast (just black and white) and sharp corners for the system to recognize it. If your device has trouble detecting it, try other images (the classic QR code might help).

    Consider that some devices might have trouble with certain images, such as the image suggested in this book. If this generates issues when testing, please try using another one. You will be testing this on your device in the upcoming sections of this chapter, so just keep this in mind.

  2. Create a Reference Image Library, an asset containing all the images we wish our app to recognize, by clicking the + button in the Project panel and selecting XR | Reference Image Library:
    Figure 22.8 – Creating a Reference Image Library

  3. Select the library asset and click the Add Image button to add a new image to the library.
  4. Drag the texture to the texture slot (the one that says None).
  5. Turn Specify Size on and set Physical Size to the size that your image will be in real life, in meters. Try to be accurate here; on some devices, not having this value right might result in the image not being tracked:
Figure 22.9 – Adding an image to be recognized

Now that we've specified the images to be detected, let's test this by placing a cube on top of the real-life image:

  1. Create a prefab of a cube and add the AR Tracked Image component to it.
  2. Add the AR Tracked Image Manager component to the AR Session Origin object. This will be responsible for detecting images and creating objects at their positions.
  3. Drag the Image Library asset to the Serialized Library property of the component to specify the images to recognize.
  4. Drag the Cube prefab to the Tracked Image Prefab property of the component:
Figure 22.10 – Setting up the Tracked Image Manager

And that's all! We will see a cube spawn at the same position the image is located at in the real world. Remember that you need to test this on the device, which we will do in the next section, so for now, let's keep coding our test app:

Figure 22.11 – Cube located on top of the image being displayed by the cellphone

Let's also prepare our app so that it can detect and display the plane surfaces the camera has recognized. This is simply done by adding the AR Plane Manager component to the AR Session Origin object:

Figure 22.12 – Adding the AR Plane Manager component

This component will detect surface planes around our house as we move the camera around. Detection can take a while, so it's important to visualize the detected areas to get feedback and ensure it's working properly. We can get information about the detected planes manually through a reference to the AR Plane Manager component but, luckily, Unity allows us to visualize planes easily. Let's take a look:

  1. Create a plane prefab, first creating the plane via GameObject | 3D Object | Plane.
  2. Add a Line Renderer to it. This will allow us to draw a line over the edges of the detected areas.
  3. Set the Width property of Line Renderer to a small value such as 0.01, the Color property to black, and uncheck Use World Space:
    Figure 22.13 – Setting the Line Renderer

  4. Remember to create a material for Line Renderer with the proper shader and set it as the material of the renderer:
    Figure 22.14 – Creating the Line Renderer Material

  5. Also, create a transparent material and use it in the plane's MeshRenderer. We want to see through it so that we can easily see the real surface beneath:
    Figure 22.15 – Material for the detected plane

  6. Add the AR Plane and AR Plane Mesh Visualizer components to the Plane prefab.
  7. Drag the prefab to the Plane Prefab property of the AR Plane Manager component of the AR Session Origin object:
Figure 22.16 – Setting the plane visualization prefab

Now, we have a way to see the planes, but seeing them is not the only thing we can do (sometimes, we don't even want them to be visible). The real power of planes resides in placing virtual objects on top of real-life surfaces by tapping a specific plane area and getting its real-life position. We can access the plane data using the AR Plane Manager or the AR Plane component of our visualization planes, but an easier option is the AR Raycast Manager component.

The AR Raycast Manager component provides us with the equivalent of the Physics.Raycast function of the Unity Physics system, which, as you may recall, creates imaginary rays that start from one position and go toward a specified direction in order to hit surfaces and detect the exact hit point. The version provided by AR Raycast Manager, instead of colliding with Physics Colliders, collides with tracked objects, mostly Point Clouds (which we are not using) and the Planes we are tracking. We can test this feature by following these steps:

  1. Add the AR Raycast Manager component to the AR Session Origin object.
  2. Create a custom script called InstanceOnPlane in the AR Session Origin object.
  3. In Awake, cache the reference to ARRaycastManager. You will need to add the using UnityEngine.XR.ARFoundation; line to the top of the script for this class to be usable in our script.
  4. Create a private field of the List<ARRaycastHit> type and instantiate it; the Raycast is going to detect every plane our ray hits, not just the first one:
    Figure 22.17 – List to store hits

  5. Under Update, check if the Left Mouse Button (KeyCode.Mouse0) is being pressed. In AR apps, the mouse is emulated with the device's touch screen (you can also use the Input.touches array for multi-touch support).
  6. Inside the if statement, add another condition for calling the Raycast function of AR Raycast Manager, passing the position of the mouse as the first parameter and the list of hits as the second.
  7. This will throw a raycast toward the direction the player touches the screen and store the hits inside the list we provided. This will return true if something has been hit, and false if not:
    Figure 22.18 – Throwing AR raycasts

  8. Add a public field to specify the prefab to instantiate in the place we touched. You can just create a Sphere prefab to test this; there's no need to add any special component to the prefab here.
  9. Instantiate the prefab at the position and rotation given by the Pose property of the first hit stored in the list. The hits are sorted by distance, so the first hit is the closest one. Your final script should look as follows (a text sketch of the whole script appears after these steps):
Figure 22.19 – Raycaster component
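
Since the book shows this script only as a screenshot, here is a minimal sketch of how the complete InstanceOnPlane script might look, assuming the AR Foundation 4.x API; the prefab field name is an assumption:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class InstanceOnPlane : MonoBehaviour
{
    // Prefab to place where the player taps; assign the test Sphere in the Inspector
    [SerializeField] private GameObject prefab;

    private ARRaycastManager raycastManager;
    // Reused on every tap; Raycast fills it with every plane the ray crossed
    private List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        // The manager sits on the same AR Session Origin object as this script
        raycastManager = GetComponent<ARRaycastManager>();
    }

    private void Update()
    {
        // On mobile devices, the first touch is emulated as the left mouse button.
        // Omitting the optional third parameter uses the default trackable mask,
        // which includes planes.
        if (Input.GetKeyDown(KeyCode.Mouse0) &&
            raycastManager.Raycast(Input.mousePosition, hits))
        {
            // Hits are sorted by distance, so the first one is the closest plane
            Pose hitPose = hits[0].pose;
            Instantiate(prefab, hitPose.position, hitPose.rotation);
        }
    }
}
```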

In this section, we learned how to create a new AR project using AR Foundation. We discussed how to install and set up the framework, as well as how to detect real-life image positions and surfaces, and then how to place objects on top of them.

As you may have noticed, we never hit play to test this, and sadly at the time of writing this book, we cannot test this in the Editor. Instead, we need to test this directly on the device. Due to this, in the next section, we are going to learn how to do builds for mobile devices such as Android and iOS.

Building for mobile devices

Unity is a very powerful tool that easily solves the most common problems in game development, and one of them is building the game for several target platforms. Now, the Unity part of building our project for such devices is easy, but each platform has its non-Unity-related nuances for installing development builds. In order to test our AR app, we need to run it directly on the device. So, let's explore how we can make our app run on Android and iOS, the most common mobile platforms.

Before diving into this topic, it is worth mentioning that the following procedures change a lot over time, so you will need to find the latest instructions on the internet. The Unity Learn portal site (https://learn.unity.com/tutorial/building-for-mobile) may be a good alternative in case the instructions in this book fail, but try the steps here first.

In this section, we will examine the following mobile building concepts:

  • Building for Android
  • Building for iOS

Let's start by discussing how to build our app so that it runs on Android phones.

Building for Android

Creating Android builds is relatively easy compared to other platforms, so we'll start with Android. Remember that you will need an Android device capable of running AR Foundation apps, so please refer to the link regarding Android supported devices we mentioned in the first section of this chapter. The first thing we need to do is check if we have installed Unity's Android support and configured our project to use that platform. To do that, follow these steps:

  1. Close Unity and open Unity Hub.
  2. Go to the Installs section and locate the Unity version you are working on.
  3. Click the three dots button at the top-right corner of the Unity version and click Add Modules:
    Figure 22.20 – Adding modules to the Unity version

  4. Make sure Android Build Support and the sub-options that are displayed when you click the arrow on its left are checked. If not, check them and click the Done button at the bottom-right to install them:
    Figure 22.21 – Adding Android support to Unity

  5. Open the AR project we created in this chapter.
  6. Go to Build Settings (File | Build Settings).
  7. Select the Android platform from the list and click the Switch Platform button at the bottom-right part of the window:
Figure 22.22 – Switching to Android builds

To build an app on Android, there are some requirements we need to meet, such as having the Java SDK (not the regular Java runtime) and Android SDK installed, but luckily, the new versions of Unity take care of that. Just to double-check that we have installed the needed dependencies, follow these steps:

  1. Go to Unity Preferences (Edit | Preferences on Windows, Unity | Preferences on Mac).
  2. Click External Tools.
  3. Check that all the options that say …Installed with Unity on the Android section are checked. This means we will be using all the dependencies installed by Unity:
Figure 22.23 – Using installed dependencies

There are some additional Android AR Core-specific related settings to check that you can find at https://developers.google.com/ar/develop/unity-arf/quickstart-android. These can change if you are using newer versions of AR Core. You can apply them by following these steps:

  1. Go to Player Settings (Edit | Project Settings | Player).
  2. Uncheck Multithreaded Rendering and Auto Graphics API.
  3. Remove Vulkan from the Graphics APIs list.
  4. Set Minimum API Level to Android 7.0:
Figure 22.24 – AR Core settings

Now, you can finally build the app from File | Build Settings like usual, by using the Build button. This time, the output will be a single APK file that you can install by copying the file to your device and opening it. Remember that in order to install APKs that weren't downloaded from the Play Store, you need to set your device to allow Install Unknown Apps. The location for that option varies a lot, depending on the Android version and the device you are using, but this option is usually located in the Security settings. Some Android versions prompt you to view these settings when installing the APK.

Now, we can copy and install the generated APK build file every time we want to create a build. However, we can let Unity do that for us using the Build and Run button. This option, after building the app, will look for the first Android device connected to your computer via USB and will automatically install the app. For this to work, we need to prepare our device and PC, as follows:

  1. On your device, find the build number in the Settings section, whose location, again, can change depending on the device. On my device, it is located in the About Phone | Software Information section:

    Figure 22.25 – Locating the build number

  2. Tap it a few times until the device says you are now a developer. This procedure enables the hidden developer options on the device, which you can now find in the settings.
  3. Open the developer options and turn on USB Debugging, which gives your PC special permissions over your device. In this case, it allows your PC to install apps.
  4. Install the USB drivers from your phone manufacturer's site onto your computer. For example, if you have a Samsung device, search for Samsung USB Driver. If you can't find that, you can look for Android USB Driver to get the generic drivers, but those might not work if your device manufacturer provides their own. On Mac, this step is usually not necessary.
  5. Connect your device (or reconnect it if it's already connected). The option to Allow USB Debugging for your computer will appear on the device. Check Always Allow and click OK:
    Figure 22.26 – Allowing USB debugging

  6. Accept the Allow Data prompt that appears.
  7. If these options don't appear, check that the USB Mode of your device is set to Debugging and not any other mode.
  8. In Unity, build with the Build and Run button.
  9. Please remember to try another image if you have trouble detecting the image where we instantiate the player (the Unity logo, in my case). This might vary a lot, according to your device's capabilities.

And that's all! Now that you have your app running on your device, let's learn how to do the same for the iOS platform.

Building for iOS

When developing on iOS, you will need to spend some money. You will need to run Xcode, a piece of software you can only run on OS X. Due to this, you'll need a device that can run it, such as a MacBook, a Mac mini, and so on. There may be ways to run OS X on PCs, but you will need to find this out and try it for yourself. Besides spending on a Mac and on an iOS device (iPhone, iPad, iPod, and so on), you'll need to pay for an Apple Developer account, which costs 99 USD per year, even if you are not planning to release the application on the App Store (there may be alternatives, but, again, you will need to find them).

So, to create an iOS build, you should do the following:

  1. Get a Mac computer.
  2. Get an iOS device.
  3. Create an Apple Developer account (at the time of writing this book, you can create one at https://developer.apple.com/).
  4. Install Xcode from the App Store onto your Mac.
  5. Check that you have iOS Build Support in your Unity installation via Unity Hub. Please refer to the Building for Android section for more information about this step:
    Figure 22.27 – Enabling iOS build support

  6. Switch to the iOS platform under Build Settings, selecting iOS and clicking the Switch Platform button:
    Figure 22.28 – Switching to iOS build

  7. Click the Build button in the Build Settings window and wait.

You will notice that the result of the build process is a folder containing an Xcode project. Unity cannot create the build directly, so it generates a project you can open with the Xcode software we mentioned previously. The steps you need to follow to create a build with the Xcode version being used in this book (11.4.1) are as follows:

  1. Double-click the .xcodeproj file inside the generated folder:
    Figure 22.29 – Xcode project file

  2. Go to Xcode | Preferences.
  3. In the Accounts tab, hit the + button at the bottom-left part of the window and log in with the Apple account you registered as an Apple developer:
    Figure 22.30 – Account Settings

  4. Connect your device and select it from the top-left part of the window, which should now say Generic iOS device:
    Figure 22.31 – Selecting the device

  5. In the left panel, click the folder icon and then the Unity-iPhone settings to display the project settings.
  6. From the TARGETS list, select Unity-iPhone and click on the Signing & Capabilities tab.
  7. In the Team settings, select the option that says Personal Team:
    Figure 22.32 – Selecting a team

  8. If you see a Failed to register bundle identifier error, just change the Bundle Identifier setting to another one, always respecting the format (com.XXXX.XXXX), and then click Try Again until the error is resolved. Once you find an identifier that works, set it in Unity (Bundle Identifier under Player Settings) so that you don't need to change it on every build.
  9. Hit the Play button at the top-left part of the window and wait for the build to complete. You might be prompted to enter your password a couple of times in the process, so please do so.
  10. When the build completes, remember to unlock the device. A prompt will ask you to do that. Note that the process won't continue unless you unlock the phone.
  11. After completion, you may see an error saying that the app couldn't be launched but that it was installed anyway. If you try to open it, it will say you need to trust the developer of the app, which you can do by going to the settings of your device.
  12. From there, go to General | Device Management and select the first developer in the list.
  13. Click the blue Trust … button and then Trust.
  14. Try to open the app again.
  15. Please remember to try another image if you're having trouble detecting the image where we instantiate the player (the Unity logo, in my case). This might vary a lot, depending on your device's capabilities.

In this section, we discussed how to build a Unity project that runs on iOS and Android, thus allowing us to create mobile apps, specifically AR mobile apps. Like any build, there are methods we can follow to profile and debug, as we saw when we looked at PC builds, but we are not going to discuss that here. Now that we have created our first test project, we will convert it into a real game by adding some mechanics to it.

Creating a simple AR game

As we discussed previously, the idea is to create a simple game where we can move our player while moving a real-life image, and also put in some Enemy Spawners by just tapping where we want them to be, such as a wall, the floor, a table, and so on. Our player will automatically shoot at the nearest Enemy, and the enemies will shoot directly at the player, so our only task will be to move the Player so that they avoid bullets. We are going to implement these game mechanics using scripts very similar to the ones we used in this book's main project.

In this section, we will develop the following AR game features:

  • Spawning the Player and Enemies
  • Coding the Player and Enemy behavior

First, we are going to discuss how to make our Player and Enemies appear on the app, specifically in real-world positions, and then we will make them move and shoot each other to create the specified gameplay mechanics. Let's start with spawning.

Spawning the Player and Enemies

Let's start with the Player, since that's the easiest one to deal with: we will create a prefab with the graphics we want the player to have (in my case, just a cube), a Rigidbody with Is Kinematic checked (the Player will move, but not through physics), and an AR Tracked Image script. We will set that prefab as the Tracked Image Prefab of the AR Tracked Image Manager component in the AR Session Origin object, which will put the Player on the tracked image. Remember to set the size of the Player in real-life terms. In my case, I scaled the Player to (0.05, 0.05, 0.05); since the original cube is 1 meter in size, this means that my player will be 5x5x5 centimeters. Your Player prefab should look as follows:

Figure 22.33 – The starting "Player" prefab

The enemies will require a little bit more work, as shown here:

  1. Create a prefab called Spawner with the graphic you want your Spawner to have (in my case, a cylinder) and its real-life size.
  2. Add a custom script that spawns a prefab every few seconds, such as the one shown in the following screenshot (a text sketch of this script appears after these steps).
  3. You will notice the usage of Physics.IgnoreCollision to prevent the spawned object from colliding with the Spawner object, getting the colliders of both objects and passing them to the function. You can also use the Layer Collision Matrix to prevent collisions, just like we did in this book's main project, if you prefer to:
    Figure 22.34 – Spawner script

  4. Create an Enemy prefab with the desired graphic (a Capsule, in my case) and a Rigidbody component with the Is Kinematic checkbox checked. This way, the Enemy will move but not with physics. Remember to consider the real-life size of the Enemy.
  5. Set the Prefab property of the Spawner so that it spawns our Enemy at your desired time frequency:
    Figure 22.35 – Configuring the Spawner

  6. Add a new SpawnerPlacer custom script to the AR Session Origin object that instantiates a prefab in the place the player tapped using the AR Raycast system, as shown in the following screenshot:
    Figure 22.36 – Placing the Spawners

  7. Set the prefab of SpawnerPlacer so that it spawns the Spawner prefab we created earlier.
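
The Spawner script is shown only as a screenshot, so here is a minimal sketch of how it might look; the field names and the use of InvokeRepeating are assumptions. SpawnerPlacer is essentially the InstanceOnPlane script from the previous section, with the Spawner prefab assigned to its prefab field:

```csharp
using UnityEngine;

public class Spawner : MonoBehaviour
{
    [SerializeField] private GameObject prefab;         // the Enemy here; the Bullet later on
    [SerializeField] private float spawnFrequency = 3f; // seconds between spawns

    private void Start()
    {
        // Spawn repeatedly at the configured frequency
        InvokeRepeating(nameof(Spawn), spawnFrequency, spawnFrequency);
    }

    private void Spawn()
    {
        GameObject clone = Instantiate(prefab, transform.position, transform.rotation);

        // Prevent the spawned object from colliding with its own Spawner;
        // the Layer Collision Matrix is a valid alternative to this per-pair call
        Collider spawnerCollider = GetComponent<Collider>();
        Collider cloneCollider = clone.GetComponent<Collider>();
        if (spawnerCollider != null && cloneCollider != null)
        {
            Physics.IgnoreCollision(spawnerCollider, cloneCollider);
        }
    }
}
```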

And that's all for the first part. If you test the game now, you will be able to tap on the detected planes in the app and see how the Spawner starts creating enemies. You can also look at the target image and see our Cube Player appear.

Now that we have the objects in the scene, let's make them do something more interesting, starting with the Enemies.

Coding the Player and Enemy behavior

The Enemy must move toward the Player in order to shoot at them, so it will need access to the Player's position. Since the Enemy is instantiated, we cannot drag the Player reference to the prefab. However, the Player is also instantiated, so we can add a PlayerManager script to it that uses the Singleton pattern (as we did with the managers in this book's main project). To do that, follow these steps:

  1. Create a PlayerManager script similar to the one shown in the following screenshot and add it to the Player:
    Figure 22.37 – Creating the PlayerManager script

  2. Now that the Enemy has a reference to the player, let's make them look at the player by adding a LookAtPlayer script, as shown here:
    Figure 22.38 – Creating the LookAtPlayer script

  3. Also, add a simple MoveForward script like the one shown in the following screenshot to make the Enemy not only look at the player but also move toward them. Since the LookAtPlayer script makes the Enemy face the Player, moving along the local Z axis is enough (sketches of these three scripts appear after this step):
Figure 22.39 – Creating the MoveForward script
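
As these scripts are shown only as screenshots, here is a minimal sketch of how they might look. Remember that in Unity, each MonoBehaviour goes in its own file named after the class; the speed default is an assumption:

```csharp
using UnityEngine;

// PlayerManager.cs - singleton so instantiated enemies can find the instantiated Player
public class PlayerManager : MonoBehaviour
{
    public static PlayerManager instance;

    private void Awake()
    {
        instance = this;
    }
}

// LookAtPlayer.cs - rotates the Enemy to face the Player every frame
public class LookAtPlayer : MonoBehaviour
{
    private void Update()
    {
        if (PlayerManager.instance != null)
        {
            transform.LookAt(PlayerManager.instance.transform);
        }
    }
}

// MoveForward.cs - moves along the local Z axis; LookAtPlayer keeps that axis
// pointing at the Player
public class MoveForward : MonoBehaviour
{
    [SerializeField] private float speed = 0.1f; // meters per second; AR uses real-world scale

    private void Update()
    {
        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```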

Now, we will take care of the Player movement. Remember that our player is controlled through moving the image, so here, we are actually referring to the rotation, since the player will need to automatically look and shoot at the nearest Enemy. To do this, follow these steps:

  1. Create an Enemy script and add it to the Enemy prefab.
  2. Create an EnemyManager script like the one shown in the following screenshot and add it to an empty EnemyManager object in the scene:
    Figure 22.40 – Creating the EnemyManager script

  3. In the Enemy script, make sure to register the object in the all list of EnemyManager, as we did previously with WavesManager in this book's main project:
    Figure 22.41 – Creating the Enemy script

  4. Create a LookAtNearestEnemy script like the one shown in the following screenshot and add it to the Player prefab to make it look at the nearest Enemy (sketches of these scripts follow this step):
    Figure 22.42 – Looking at the nearest Enemy
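
Here is a hedged sketch of how these three scripts might look, again with each class in its own file; the all list name matches the one mentioned in step 3, while everything else is an assumption (it also assumes an EnemyManager object already exists in the scene):

```csharp
using System.Collections.Generic;
using UnityEngine;

// EnemyManager.cs - singleton holding every alive Enemy so the Player can query them
public class EnemyManager : MonoBehaviour
{
    public static EnemyManager instance;
    public List<Enemy> all = new List<Enemy>();

    private void Awake()
    {
        instance = this;
    }
}

// Enemy.cs - registers itself in the manager's list, as we did with WavesManager
public class Enemy : MonoBehaviour
{
    private void OnEnable()
    {
        EnemyManager.instance.all.Add(this);
    }

    private void OnDisable()
    {
        EnemyManager.instance.all.Remove(this);
    }
}

// LookAtNearestEnemy.cs - rotates the Player toward the closest registered Enemy
public class LookAtNearestEnemy : MonoBehaviour
{
    private void Update()
    {
        Enemy nearest = null;
        float nearestDistance = float.MaxValue;

        foreach (Enemy enemy in EnemyManager.instance.all)
        {
            float distance = Vector3.Distance(transform.position, enemy.transform.position);
            if (distance < nearestDistance)
            {
                nearest = enemy;
                nearestDistance = distance;
            }
        }

        if (nearest != null)
        {
            transform.LookAt(nearest.transform);
        }
    }
}
```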

    Now that our objects are rotating and moving as expected, the only thing missing is shooting and damaging:

  5. Create a Life script like the one shown in the following screenshot and add it to both the Player and Enemy prefabs. Remember to set a value for the amount-of-life field. Note that in this version of Life, instead of checking every frame whether life has reached zero, we check it only when damage is dealt (when the Damage function is executed); the version from this book's main project also works (sketches of these scripts appear after these steps):
    Figure 22.43 – Creating a Life component

  6. Create a Bullet prefab with the desired graphics, a collider with the Is Trigger checkbox checked, a Rigidbody component with Is Kinematic checked (a Kinematic Trigger Collider), and the proper real-life size.
  7. Add the MoveForward script to the Bullet prefab to make it move. Remember to set the speed.
  8. Add a Spawner script to both the Player and the Enemy components and set the Bullet prefab as the prefab to spawn, as well as the desired spawn frequency.
  9. Add a Damager script like the one shown in the following screenshot to the Bullet prefab to make bullets inflict damage on the objects they touch. Remember to set the damage:
    Figure 22.44 – Creating a Damager script – part 1

  10. Add an AutoDestroy script like the one shown in the following screenshot to the Bullet prefab to make it despawn after a while. Remember to set the Destroy time:
Figure 22.45 – Creating a Damager script – part 2
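
As before, the book shows these scripts as screenshots; a minimal sketch under the same assumptions (one class per file, default values chosen for illustration, and the bullet destroying itself on impact) could look as follows:

```csharp
using UnityEngine;

// Life.cs - only checks for death when damage is dealt, instead of polling every frame
public class Life : MonoBehaviour
{
    [SerializeField] private float amount = 10f;

    public void Damage(float damageAmount)
    {
        amount -= damageAmount;
        if (amount <= 0)
        {
            Destroy(gameObject);
        }
    }
}

// Damager.cs - goes on the Bullet; damages any Life component it touches
public class Damager : MonoBehaviour
{
    [SerializeField] private float damage = 1f;

    private void OnTriggerEnter(Collider other)
    {
        Life life = other.GetComponent<Life>();
        if (life != null)
        {
            life.Damage(damage);
            Destroy(gameObject); // assuming the bullet is consumed on impact
        }
    }
}

// AutoDestroy.cs - despawns the Bullet after a configurable time
public class AutoDestroy : MonoBehaviour
{
    [SerializeField] private float destroyTime = 3f;

    private void Awake()
    {
        Destroy(gameObject, destroyTime);
    }
}
```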

And that's all! As you can see, we basically created a new game using almost the same scripts we used in the main game, mostly because we designed them to be generic (and the game genres are almost the same). Of course, this project can be improved a lot, but we have a nice base project upon which to create amazing AR apps.

Summary

In this chapter, we introduced the AR Foundation Unity framework, explored how to set it up, and how to implement several tracking features so that we can position virtual objects on top of real-life objects. We also discussed how to build our project so that it can run on both iOS and Android platforms, which is the only way we can test our AR apps at the time of writing. Finally, we created a simple AR game based on the game we created in the main project but modified it so that it's suitable for use in AR scenarios.

With this new knowledge, you will be able to start your path as an AR app developer, creating apps that augment real objects with virtual objects by detecting the positions of the real objects. This can be applied to games, training apps, and simulations. You may even be able to find new fields of usage, so take advantage of this new technology and its new possibilities!
