Chapter 13

Handling Device Input

Although our bowling game is up and running on iOS, it’s not quite in a playable state, because we don’t have any working player controls. Which brings us to the issue of input handling, perhaps the most obvious difference between desktop and mobile games.

Instead of reading the mouse or keyboard, an iOS game has to use one of its device sensors, typically the touchscreen (tapping and swiping) or accelerometer (shaking and tilting). We’ll opt for touchscreen input to control the bowling ball, but we’ll do some shake detection with the accelerometer and play with the device camera a little bit, too.

The project for this chapter on http://learnunity4.com/ has the script changes introduced in this chapter, and no other assets are added. This is true for the remainder of this book. It’s all scripting from here on out!

The Touch Screen

UnityGUI already works with the touchscreen in iOS, so if you Build and Run our bowling game on a device right now, or test it with the Unity Remote in the Editor, you can operate the initial menu by tapping the buttons. When you tap the Play button on the pause menu, the bowling ball drops. So far, so good.

But what about the bowling ball control? Input.GetAxis does return values when swiping on the screen, but the results are unlikely to be exactly what you want (in older versions of Unity iOS, Input.GetAxis was not functional at all). In any case, we need to first define how the control will work in the iOS version of the bowling game.

Swipe the Ball

For Fugu Bowl, we’ll adopt the touchscreen control that I use for HyperBowl. The player pushes the ball in a certain direction by swiping along the screen in that direction. Swiping up on the screen will push the ball forward, swiping down will push the ball back, swiping left will roll it left, and swiping right will roll it right.

The swipe control scheme is similar enough to the mouse controls that we can work within the framework of the existing FuguBowlForce script. Bring up that script and add some new public variables, swipepowerx and swipepowery, so we can adjust the swipe power, just like we did with the mouse control (Listing 13-1).

Listing 13-1.  Adjustment Variables for Swipe Control in FuguBowlForce.js

var swipepowerx:float = 0.1;
var swipepowery:float = 0.08;

We added new variables instead of reusing mousepowerx and mousepowery so that we can switch between the standalone and iOS build targets without clobbering the power adjustment values for each platform. The properties for both will always show up in the Inspector View (Figure 13-1).


Figure 13-1. Control adjustments for both desktop and iOS in FuguBowlForce.js

The big change is our CalcForce function, which gets called by our Update callback once per frame. On iOS, instead of calling the static function Input.GetAxis to see how much the mouse has been moved, CalcForce calls the static function Input.GetTouch to check for swipes. The UNITY_IPHONE preprocessor definition makes sure the new code exists only on iOS and the old code on any other platform (Listing 13-2).

Listing 13-2.  Detecting Swipes in FuguBowlForce.js

function CalcForce() {
        var deltatime:float = Time.deltaTime;
#if UNITY_IPHONE
        if (Input.touchCount > 0) {
                // Get movement of the finger since the last frame
                var touch:Touch = Input.GetTouch(0);
                if (touch.phase == TouchPhase.Moved) {
                        var touchPositionDelta:Vector2 = touch.deltaPosition;
                        forcey = swipepowery*touchPositionDelta.y/deltatime;
                        forcex = swipepowerx*touchPositionDelta.x/deltatime;
                }
        }
#else
        forcey = mousepowery*Input.GetAxis("Mouse Y")/deltatime;
        forcex = mousepowerx*Input.GetAxis("Mouse X")/deltatime;
#endif
}

Like Input.GetAxis, Input.GetTouch returns information registered since the previous frame, so you can call it in CalcForce, which is called from Update and thus runs once each frame.

Input.GetTouch returns a struct of type Touch, which describes the touch event: whether the finger was pressed, released, or moved, and, if moved, by how many pixels.

We’re now dealing with multitouch screens, so Input.GetTouch takes one parameter, an integer indicating which of the latest Touch events to return. The number of touch events is available in the static variable Input.touchCount, so we can process all touch events in a loop like this:

  
for (var i:int=0; i < Input.touchCount; ++i) {
        var touch:Touch = Input.GetTouch(i);
// do your stuff here
}

But to keep things simple here, we’re only interested in one finger, so check whether there’s at least one Touch event available, and if so, examine just the first one:

  
if (Input.touchCount > 0) {
        var touch:Touch = Input.GetTouch(0);

Touch events can signify different finger actions: a press, release, or drag. This can be checked by inspecting the phase property of the Touch object, which is a TouchPhase enumeration. We’re only interested in Touch events that signify a finger has dragged along the screen, so check if the touch phase is TouchPhase.Moved:

  
if (touch.phase == TouchPhase.Moved) {

If it is, then calculate the force used to push the bowling ball. It’s similar to the mouse control, but instead of multiplying by Input.GetAxis, multiply by the deltaPosition property of the Touch event, which is the number of pixels the finger has dragged across:

  
var touchPositionDelta:Vector2 = touch.deltaPosition;
forcey = swipepowery*touchPositionDelta.y/deltatime;
forcex = swipepowerx*touchPositionDelta.x/deltatime;

Notice the deltaPosition property is a Vector2, which is just like a Vector3, but with only x and y properties, no z value.
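The force calculation is just arithmetic, so it can be sketched outside Unity in plain JavaScript. The function name and sample values here are hypothetical; in the actual script, the inputs come from touch.deltaPosition and Time.deltaTime:

```javascript
// Plain-JavaScript sketch of the per-frame swipe arithmetic.
// deltaPosition is in pixels, deltaTime in seconds.
function swipeForce(deltaPosition, deltaTime, swipepowerx, swipepowery) {
  return {
    // Dividing by deltaTime turns "pixels this frame" into a
    // frame-rate-independent speed before scaling by the power factor.
    forcex: swipepowerx * deltaPosition.x / deltaTime,
    forcey: swipepowery * deltaPosition.y / deltaTime
  };
}

// A 20-pixel upward swipe registered during a 1/50-second frame:
const force = swipeForce({ x: 0, y: 20 }, 0.02, 0.1, 0.08);
console.log(force.forcex, force.forcey);
```

Note that the same swipe produces the same force regardless of frame rate, which is the point of the deltaTime division.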

Now if you click Play in the Editor and test with the Unity Remote, or if you perform a Build and Run on a device, the ball will roll in the direction you swipe.

The complete FuguBowlForce script with the touchscreen additions is available in the project for this chapter on http://learnunity4.com/.

Tap the Ball

Although our bowling game doesn’t care where on the screen the swiping takes place, many games require the ability to detect if a specific GameObject has been touched. On desktop and web platforms, mouse events over GameObjects with Collider Components can be detected with callbacks such as OnMouseDown, OnMouseUp, and OnMouseOver. For example, the OnMouseDown callback in a script is invoked at the beginning of a mouse click that takes place over the script’s GameObject, or, rather, the GameObject’s Collider Component.

Note   The OnMouse callbacks also work on GameObjects with GUIText and GUITexture Components. This was the closest thing to a built-in GUI system in Unity before UnityGUI was introduced.

To demonstrate how the OnMouse callbacks work, create a new JavaScript named FuguDebugOnMouse and attach it to the Ball GameObject (Figure 13-2).


Figure 13-2. Ball with the FuguDebugOnMouse script attached

Then add the contents of Listing 13-3 to the script.

Listing 13-3.  Detecting a Mouse Click in FuguDebugOnMouse.js

#pragma strict

function OnMouseDown () {
        Debug.Log("GameObject "+ gameObject.name + " was touched");
}

Now the script contains an OnMouseDown callback that logs a message about the GameObject getting touched. When you run the game in the Editor and click the Ball, the debug message appears in the Console View (Figure 13-3).


Figure 13-3. Debug message demonstrating the OnMouseDown callback

Unity iOS invokes the OnMouse callbacks in response to touch events, except in the cases where there’s no obvious mapping. For example, what’s the touch equivalent of OnMouseOver, which is called when the mouse hovers over a GameObject? But it makes sense to call OnMouseDown when a GameObject is tapped (and OnMouseUp when the finger stops touching the screen). And this is what happens, as you can see if you run the game again in the Editor with the Unity Remote or if you Build and Run on a device. In either case, tapping the Ball will produce the same debug message you saw when clicking it.

Although it’s convenient that the OnMouse callbacks work for touches, at least to this extent, it’s useful to know how to implement your own tap-object detection. It’s a straightforward combination of checking for a tap and then performing a raycast from the screen position into the 3D world along the camera direction. Raycasting is the process of finding the first object that a ray intersects (as you may recall from high school math, a ray has a starting position and a direction, like an arrow).
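To make the geometry concrete, here’s a minimal ray-versus-sphere intersection test in plain JavaScript. This is only an illustrative sketch of what a raycast computes, not Unity’s implementation; Physics.Raycast performs the equivalent test against the Collider Components in the scene:

```javascript
// Minimal ray-vs-sphere intersection, the core operation behind a
// raycast. Returns the distance along the ray to the first hit, or
// null if the ray misses. Assumes dir is a unit-length vector.
function raySphere(origin, dir, center, radius) {
  // Vector from the ray origin to the sphere center.
  const oc = {
    x: center.x - origin.x,
    y: center.y - origin.y,
    z: center.z - origin.z
  };
  // Project oc onto the ray direction: how far along the ray is the
  // point closest to the sphere center?
  const t = oc.x * dir.x + oc.y * dir.y + oc.z * dir.z;
  // Squared distance from the sphere center to that closest point.
  const d2 = (oc.x * oc.x + oc.y * oc.y + oc.z * oc.z) - t * t;
  const r2 = radius * radius;
  if (d2 > r2 || t < 0) return null; // ray misses, or sphere is behind
  return t - Math.sqrt(r2 - d2);     // distance to the near intersection
}

// Ray from the origin along +z, sphere of radius 1 centered at z = 5:
console.log(raySphere({x:0,y:0,z:0}, {x:0,y:0,z:1}, {x:0,y:0,z:5}, 1)); // 4
```

A real physics engine runs a test like this (and analogous ones for boxes, meshes, and so on) against every candidate Collider and reports the nearest hit.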

To demonstrate, let’s create a new JavaScript named FuguOnTap, attach it to the Main Camera (Figure 13-4), and add the contents of Listing 13-4 to the script.


Figure 13-4. Main Camera with the FuguOnTap script attached

Listing 13-4.  A Script Mimicking OnMouseDown Events

#pragma strict

#if UNITY_IPHONE
function Update () {
        for (var i = 0; i < Input.touchCount; ++i) {
                if (Input.GetTouch(i).phase == TouchPhase.Began) {
                        var ray:Ray = camera.ScreenPointToRay(Input.GetTouch(i).position);
                        var hit:RaycastHit;
                        if (Physics.Raycast(ray, hit, camera.farClipPlane, camera.cullingMask)) {
                                hit.collider.SendMessage("OnTap", SendMessageOptions.DontRequireReceiver);
                        }
                }
        }
}
#endif

The entire script consists of an Update callback that loops through all the touch events registered in the past frame. But in this case, only the beginning of a tap is of interest, so only touches in the phase TouchPhase.Began are considered. The screen coordinates of each such touch are extracted and passed to the Camera function ScreenPointToRay to form a Ray that originates at the screen position and projects along the Camera direction into the 3D world.

The Ray is passed in as the first argument to the static function Physics.Raycast to see if the Ray intersects the Collider Component of any GameObject. Information about the first intersection, if any, is returned in the RaycastHit struct that is passed in as the second argument. The third argument is the raycast distance, and the fourth argument is the set of layers that will be tested for intersection. You restrict the possible raycast results by supplying the Camera far clipping plane distance as the raycast distance and the Camera culling mask (the set of visible layers) as the set of layers to test for intersection.

If the call to Physics.Raycast returns true, meaning there’s been an intersection, then you retrieve the intersecting Collider Component and call SendMessage on it to invoke OnTap callbacks in attached scripts (calling SendMessage on a Component is equivalent to calling SendMessage on the Component’s GameObject).

Notice that the call to SendMessage includes an optional second argument, SendMessageOptions.DontRequireReceiver. Without that argument, Unity will report an error if the message is sent to a GameObject that doesn’t have any matching functions, or, in this case, any OnTap functions.

The entire Update callback is wrapped in #if UNITY_IPHONE, because it’s only intended for iOS touchscreen input. Likewise, you can now use UNITY_IPHONE to rename the OnMouseDown callback to OnTap, so it will be called by the FuguOnTap script (Listing 13-5).

Listing 13-5.  The FuguDebugOnMouse Script with an Alternate Callback Name for iOS

#pragma strict
#if !UNITY_IPHONE
function OnMouseDown () {
#else
function OnTap () {
#endif
        Debug.Log("GameObject "+ gameObject.name + " was touched");
}

Now if you run the game again and tap the Ball, you’ll still see the debug message, but it will result from the OnTap messages sent from FuguOnTap.

The Accelerometer

Every iOS device has an accelerometer that measures acceleration in three different directions. The three values can be examined in the static variable Input.acceleration, which is a Vector3.

The x value corresponds to acceleration that runs left to right along the face of the device. The y value represents the acceleration running from top to bottom along the face of the device, and the z value is the acceleration from the front to the back of the device. Starting with Unity 4, the accelerometer values are adjusted for the device orientation, so if you’re holding the device in landscape mode, the accelerometer x values still run from left to right.

Debug the Accelerometer

A time-honored way to figure out how code works is to print stuff out. Let’s create a new JavaScript called FuguDebugAccel and attach it to a new GameObject named DebugAccel in the bowling scene. Add the Update function as shown in Listing 13-6 to the script.

Listing 13-6.  Code to Print Out Accelerometer Values

function Update () {
        Debug.Log("accel x: "+Input.acceleration.x+" y: "+Input.acceleration.y+" z: "+Input.acceleration.z);
}

The Update function takes the x, y, and z accelerometer values and concatenates them into a somewhat readable String. Debug.Log sends that String to the Console View if you’re testing in the Unity Editor (with the Unity Remote) or to the Xcode debug area if you’re running in the iOS Simulator or on a device.

Try tilting the device around and see how the values change. For example, the Debug.Log output shown in Figure 13-5 results from placing the device flat on a table. The x and y values are nearly 0, and the z value is close to −1. If the device were flipped face down, the z value would be close to 1.


Figure 13-5. Debug.Log output of accelerometer values

The acceleration values are in units of gravity. At rest, each accelerometer value ranges from −1 to 1, where 1 is full earth gravity (9.8 meters per second squared). Now try shaking the device. You should see the numbers spike up and down.

Detect Shakes

An issue with our pause menu on iOS is that we can’t pause with the ESC key, since there’s no ESC key on our devices (although Android devices typically have a Back button that behaves like an ESC key).

It’s straightforward to add a pause button to the screen in the FuguPause script by adding a GUI.Button call inside the OnGUI function:

  
if (GUI.Button(Rect(0,0,100,50), "Pause")) { PauseGame(); }

However, I don’t like to have buttons on the screen during gameplay, especially on really small screens like the iPhone. So let’s take advantage of the alternate input that iOS devices can provide and pause by shaking the device.

From our previous exercise in printing out accelerometer values, we saw that shaking the device causes at least one of the x, y, or z values of Input.acceleration to spike up or down. So we can check for a shake by checking whether any of the values are much greater than 1 or much less than −1, for example. But we don’t have to be very exact, so we can just take the square of the magnitude (length) of the vector:

  
if (Input.acceleration.sqrMagnitude>shakeThreshold)

Getting the sqrMagnitude of the Vector3 is faster than getting the magnitude, since the magnitude of a vector is the square root of x² + y² + z², and this way you avoid the square root operation, which is relatively expensive (remember learning to calculate square roots by hand in high school?).
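The squared-magnitude comparison is simple enough to sketch in plain JavaScript (the function and sample values are hypothetical; in the game, the vector comes from Input.acceleration):

```javascript
// Shake check using the squared magnitude, avoiding Math.sqrt by
// comparing the squared vector length directly against the threshold.
function isShake(accel, shakeThreshold) {
  const sqrMagnitude =
    accel.x * accel.x + accel.y * accel.y + accel.z * accel.z;
  return sqrMagnitude > shakeThreshold;
}

// Device at rest, face up: squared magnitude is about 1, so no shake.
console.log(isShake({ x: 0, y: 0, z: -1 }, 5)); // false
// A sharp jolt of about 2.5g on one axis: squared magnitude ~7.25.
console.log(isShake({ x: 2.5, y: 0, z: -1 }, 5)); // true
```

Because the comparison is monotonic, squaring both sides gives the same answer as comparing magnitudes; just remember that a shakeThreshold of 5 corresponds to a magnitude of about 2.24g, not 5g.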

So let’s start by adding a public variable for the shake threshold to the FuguPause script:

  
var shakeThreshold:float = 5;

Now you can tweak that property in the Inspector View (Figure 13-6) and make it more or less sensitive to shakes.


Figure 13-6. Adjusting the shake-to-pause threshold in the Inspector View

Next, add the check of Input.acceleration as a UNITY_IPHONE replacement for the check of the ESC key (Listing 13-7).

Listing 13-7.  Adding Shake to Pause in FuguPause.js

var shakeThreshold:float = 5;

function Update() {
#if UNITY_IPHONE
        if (Input.acceleration.sqrMagnitude>shakeThreshold)
#else
        if (Input.GetKeyDown(KeyCode.Escape))
#endif
        {
                switch (currentPage) {
                case Page.None: PauseGame(); break;   // if the pause menu is not displayed, then pause
                case Page.Main: UnPauseGame(); break; // if the main pause menu is displaying, then unpause
                default: currentPage = Page.Main;     // any subpage goes back to the main page
                }
        }
}

That’s all you need. Now when you test with the Unity Remote or on a device, shaking the device will pause the game! The augmented FuguPause script (just two lines of new code!) is in the project for this chapter on http://learnunity4.com/.

The Camera

Although Unity doesn’t have much built-in support for iOS cameras, Unity does have cross-platform support for reading webcam video into a texture.

Reading the WebCam

Unity has a special kind of texture (like Texture2D, a subclass of Texture) called WebCamTexture that displays video from an attached hardware camera. It works on multiple platforms, including iOS. Let’s give it a try. Create a JavaScript called FuguWebCam and add the Start function given in Listing 13-8. The Start function creates a WebCamTexture, assigns the texture to the Material of the GameObject you’ve attached the script to, and then calls Play to start updating the texture from the camera video.

Listing 13-8.  Playing a WebCamTexture in FuguWebCam.js

function Start () {
        var webcamTexture:WebCamTexture = new WebCamTexture();
        renderer.material.mainTexture = webcamTexture;
        webcamTexture.Play();
}

Now attach the script to the Ball GameObject. If you click Play in the Editor you’ll see video from the Mac camera appear on the ball. And if you Build and Run on a device, you’ll see the same thing, only using the device camera. Figure 13-7 is a screenshot of the game on my iPod touch as I’m pointing the camera at my whiteboard.


Figure 13-7. A WebCamTexture rendered on the bowling ball

Explore Further

All in all, it didn’t take much work (compared to the packaging effort in the previous chapter) to implement touchscreen input for the bowling game, although in practice you’ll spend a lot of time testing and adjusting controls for a game to get them just right. In any case, with the shake-to-pause feature, the functional port of the bowling game is complete. In other words, the iOS version of the game is now playable!

And despite the fact there isn’t a real use for the device camera in this game, we did activate it using Unity’s WebCamTexture class to render a video texture on the bowling ball. Gimmicky, but game development is very much about learning! Especially in iOS, there’s an opportunity to play around with a lot of different control schemes and input sensors.

Scripting Reference

Almost all the Unity features introduced in this chapter are members of the “Input” class. In fact, the Scripting Reference documentation on Input includes an overview of the iOS input features.

After reading the overview of the Input class, you should read the pages for each of the functions used in this chapter, in particular “Input.GetTouch” (and the associated “Touch” struct), which has code samples showing how to loop through all the Touch events in each frame, and “Input.acceleration”, which has code samples showing how to check acceleration values in every frame.

One thing to keep in mind: the accelerometer can generate multiple values per frame. For most cases, it should be sufficient to sample Input.acceleration once per frame, but if finer-grained sampling is required, you can access the variables Input.accelerationEventCount and Input.accelerationEvents to obtain all acceleration events.
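The usual pattern for combining those per-frame events, shown in Unity’s Input.acceleration documentation, is to average them weighted by how long each was in effect. The arithmetic boils down to the following plain-JavaScript sketch, with hypothetical event objects standing in for AccelerationEvent structs:

```javascript
// Average several per-frame acceleration samples, weighting each by
// its deltaTime (how long that reading was in effect). Events here
// are plain objects: { acceleration: {x,y,z}, deltaTime: seconds }.
function averageAcceleration(events) {
  const sum = { x: 0, y: 0, z: 0 };
  let period = 0;
  for (const evt of events) {
    sum.x += evt.acceleration.x * evt.deltaTime;
    sum.y += evt.acceleration.y * evt.deltaTime;
    sum.z += evt.acceleration.z * evt.deltaTime;
    period += evt.deltaTime;
  }
  if (period > 0) {
    sum.x /= period; sum.y /= period; sum.z /= period;
  }
  return sum;
}

// Two 10ms samples of 1g and 3g on the x axis average out to 2g:
console.log(averageAcceleration([
  { acceleration: { x: 1, y: 0, z: 0 }, deltaTime: 0.01 },
  { acceleration: { x: 3, y: 0, z: 0 }, deltaTime: 0.01 }
]).x);
```

The deltaTime weighting matters because the hardware doesn’t deliver samples at perfectly even intervals; a long-lived reading should count for more than a brief one.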

The Input class provides access to other iOS sensors. The static variables “Input.gyro”, “Input.compass” and “Input.location” return data from the gyroscope, compass and location services, respectively. In addition to the pages documenting those variables, you should check out the pages on the classes of those variables: “Gyroscope”, “Compass” and “LocationServices” classes. The documentation is sparse, but you can find code samples in the individual class functions. The static variable “Input.compensateSensors” controls whether the accelerometer, gyroscope and compass data are adjusted for the screen orientation.

Besides testing object selection, raycasting is used a lot in games and graphics, so you should familiarize yourself with the page on “Physics.Raycast”, which has several code examples. For example, in HyperBowl, I use raycasting to keep the Main Camera from dipping below the ground and to set the initial positions of GameObjects to rest just on top of the ground. In both cases, I raycast down from the GameObject position toward the ground and check the resulting distance.

Finally, we covered the use of the WebCamTexture class. The page on “WebCamTexture” has a lot of stuff, but click the links to its Play, Stop, and Pause functions (the functions you’d most commonly use) to find code samples on each of those pages. Also check out the link to the “WebCamTexture” constructor. That page lists a variety of constructors that allow you to customize the texture size for different resolutions.

iOS Developer Library

It’s a good idea to read the iOS Reference Library in addition to, or even before, reading the Unity Script Reference, so you’ll know what iOS capabilities are available to be exposed through Unity and how Unity classes and functions map to their iOS counterparts.

You can view the iOS Reference Library, no login required, at http://developer.apple.com/library or from the Xcode Organizer, and from there select the Guides tab and browse the guides, in particular the following:

The “Event Handling Guide” provides an overview of touch events, which relate to Unity’s Input.GetTouch and the Touch class, and motion events, which relate to Input.acceleration and the other Input accelerometer variables.

The guide “Camera Programming Topics for iOS” describes the UIImagePickerController class used by the Prime31 Etcetera plug-ins.

The “AV Foundation Programming Guide” describes the video capture functions most likely used for the WebCamTexture class in Unity.

Asset Store

The Unity Input class gives us access to basic touch information, but it doesn’t provide access to the iOS Gesture Recognizers, which detect higher level “gestures,” such as swiping and pinching. The Asset Store comes to the rescue again, offering third-party packages that provide high-level gesture detection. I use the Finger Gestures package, which has a nice callback system for detecting and handling various gestures.

For example, in HyperBowl I wanted the pause menu to come up when the player pinches the screen (two fingers on the screen coming together). With the Finger Gestures package, it just involves adding a callback for the gesture in the pause menu script:

  
function OnGesturePinchEnd(pos1:Vector2,pos2:Vector2) {
        PauseGame();
}

and then a line in the Start or Awake function that adds the callback to the FingerGestures callback list for that gesture:

  
FingerGestures.OnPinchEnd += OnGesturePinchEnd;

Unity doesn’t provide script access to everything available in iOS, and that’s where plugins come in. The Unity plugin system allows you to add new script functions that access “native” code. So, generally speaking, if you can code something in C, C++, or Objective-C, you can make a Unity plugin for it. For example, in the Unity Asset Store, you’ll find plugins built around mobile ad SDKs and plugins for accessing iOS features.

Plugins are installed in the Assets/Plugins folder, which, as we mentioned before, is a good place to store scripts that have to be loaded before any other scripts. They also often require manual integration steps in Xcode, like adding entries to the Info.plist file or installing additional code libraries (preventing Build and Run from Unity—you have to Build, modify the Xcode project, then Run from Xcode). And then there’s always the risk that multiple plugins installed in the same Unity project will have conflicts.

That’s why I like to use the plug-ins provided by Prime31 Studios at http://prime31.com/unity. They provide a large variety of plugins that can coexist in the same Unity project, and all Xcode modifications are performed by a Unity postprocessor script, so Build and Run from Unity will still work and no messing around in Xcode is required.

For example, the Prime31 Etcetera plugin, available on the Asset Store, has a bunch of assorted features, including access to the iOS Photos gallery and the device camera, with one function call, like this:

  
EtceteraBinding.promptForPhoto(1.0);

The screenshot in Figure 13-8 shows the resulting prompt in my Fugu Maze app (where I use the photo chooser to allow players to customize the maze wall texture). If you want to try it out, download Fugu Maze Free from the App Store or one of the free HyperBowl lanes (I use the photo chooser to customize the bowling ball texture).


Figure 13-8. The Prime31 Etcetera plug-in photo chooser on an iPad

Another Prime31 plugin that makes available additional device input is the iCade plugin (only available on the Prime31 web site). The iCade is a retro-style arcade cabinet that provides joystick input to iOS devices through a Bluetooth connection. The iCade essentially pretends it’s a Bluetooth keyboard, so any code using the iCade plugin is similar to the code for keyboard controls. If we added code for iCade controls to the CalcForce function of our bowling ball code, it would look something like the snippet in Listing 13-9 (this sample is actually taken from HyperBowl).

Listing 13-9.  iCade Additions to CalcForce Using the Prime31 iCade Plug-In

var yinput:float = 0;
if (iCadeBinding.state.JoystickUp) yinput = 1;
if (iCadeBinding.state.JoystickDown) yinput = -1;
var xinput:float = 0;
if (iCadeBinding.state.JoystickLeft) xinput = 1;
if (iCadeBinding.state.JoystickRight) xinput = -1;
var deltatime:float = Time.deltaTime;
forcey += iCadePowery*yinput/deltatime;
forcex += iCadePowerx*xinput/deltatime;

Just as if we’re checking if four different keys were pressed, we’re checking if the joystick is moved in any of four directions. The joysticks aren’t any more sensitive than that.

The product information for the iCade is on the Ion Audio site at http://www.ionaudio.com/products/details/icade, and the ThinkGeek blog lists some of the games that work with the iCade (http://www.thinkgeek.com/product/e762/). If you get your game running on the iCade, be sure to contact that blog to have your app listed!
