There are other types of gestures that the user can perform, such as drawing letters or other shapes. These gestures may be more complex than a twist or drag, so we cannot just rely on simple finger tracking, but rather we need shape recognition.
We can store, recognize, and visualize shape gestures using a GestureLibrary instance:

1. First, we obtain a GestureLibrary by requesting one through the GestureLibraries type:

   GestureLibrary library = GestureLibraries.FromPrivateFile(this, "gestures");

2. Then, we load the gestures into memory using the Load() method:

   library.Load();

3. We can list the gestures in the library using the GestureEntries property:

   string[] entries = library.GestureEntries.ToArray();

4. We can query a specific gesture using the GetGestures() method:

   Gesture gesture = library.GetGestures("my gesture")[0];

5. A gesture can be rendered to a Bitmap instance:

   var bitmap = gesture.ToBitmap(100, 100, 0, Color.Yellow);

6. Or, it can be converted to a Path instance:

   var path = gesture.ToPath();
If we want to create a new gesture, we can make use of the GestureOverlayView instance and save the gesture to the library:

1. First, we add an <android.gesture.GestureOverlayView> element to the layout:

   <android.gesture.GestureOverlayView
       android:id="@+id/overlay"
       android:layout_width="match_parent"
       android:layout_height="match_parent" />

   Then we can access the overlay using the FindViewById() method:

   var overlay = FindViewById<GestureOverlayView>(Resource.Id.overlay);

   Alternatively, we can create and add a GestureOverlayView from code:

   var overlay = new GestureOverlayView(this);

2. If the gesture consists of multiple strokes, we set the GestureStrokeType property to Multiple:

   overlay.GestureStrokeType = GestureStrokeType.Multiple;

3. Once the user has drawn the gesture, we add it to the library, save the library, and clear the overlay:

   gestureLibrary.AddGesture("name", overlay.Gesture);
   gestureLibrary.Save();
   overlay.Clear(false);
When we want to respond to a user's gesture, we can place a GestureOverlayView instance on the UI and wait for the user to draw on it. We subscribe to the GesturePerformed event and recognize the gesture in the handler:

   overlay.GesturePerformed += (sender, e) => {
       var recognitions = gestureLibrary.Recognize(e.Gesture);
       var guess = recognitions.FirstOrDefault(r => r.Score > 1);
   };
Simple gesture recognition involves tracking the movement of one or more fingers across the device screen. This data is used to update the screen or manipulate objects in real time. These gestures are limited to the data that is provided every few milliseconds and usually don't track the entire gesture.
More advanced gestures require the entire gesture to be performed before it can be processed. These types of gestures are delayed and do not perform any direct manipulations. However, feedback to the user may be required in the form of a path being traced onto the screen.
Android provides an easy way to both record and recognize gestures through the GestureLibrary type. A GestureLibrary instance can be obtained through one of the methods on the GestureLibraries type, either from a raw resource or from a file on the file system.
When we have created the gesture library instance, we load or reload the gestures into memory using the Load()
method. And, if we update the library, we can persist the changes using the Save()
method.
Once loaded, we can list the gestures using the GestureEntries property, or query specific gestures using the GetGestures() method. After we have queried a specific gesture, we can make use of the various methods to convert the gesture into a visual Path or Bitmap instance.
In order to build up the library of gestures, we can either load an existing library file, or we can create and add new gestures to the library. Regardless of how gestures are created, we add them to the library using the AddGesture()
method. We pass the gesture, along with a name, to this method. After adding gestures, we need to ensure that we save the library.
Gestures can be created by building a collection of timestamped coordinates. A series of GesturePoint instances are added to a GestureStroke instance, which in turn is added to a gesture.
Another means of creating gestures is via the GestureOverlayView instance. When displayed, the user can trace the gesture, which we can then read and save into a library. Some gestures require multiple strokes, so we need to set the GestureStrokeType property to Multiple. This prevents the view from clearing the surface as soon as the user lifts all their fingers off the view.
Now that we have a collection of gestures in a gesture library, we need to be able to recognize user gestures. This is done by passing the gesture to be recognized to the Recognize()
method of the library. The Recognize()
method returns a collection of scored predictions or matches.
Each prediction is given a score or match rating, and the match collection is ordered by score from the highest match to the lowest. Scores above 1.0 generally indicate good matches, and scores below 1.0 indicate poor matches. However, any threshold can be used to determine the desired match.
When using the GestureOverlayView
instance to draw gestures, there are several events that we can subscribe to. One of the most common events is GesturePerformed
, which is raised as soon as the view detects that the user has finished drawing a gesture.
The GestureOverlayView
instance can also be used to overlay regular views, and detect gestures when the user interacts with those views. As the GestureOverlayView
instance is simply a FrameLayout
instance, child views can be added to it in the same way views are added to other layouts.
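For example (a minimal sketch; the choice of a ListView as the child view is arbitrary here), we can wrap an existing view in an overlay from code:

```csharp
// The overlay is a FrameLayout, so children are added as with any layout.
var overlay = new GestureOverlayView(this);
var list = new ListView(this);
overlay.AddView(list, new FrameLayout.LayoutParams(
    ViewGroup.LayoutParams.MatchParent,
    ViewGroup.LayoutParams.MatchParent));
SetContentView(overlay);
```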
As child views will also process events, we need to ensure that, as soon as the gesture overlay view detects a gesture, it prevents the child views from processing touch events. This is done by setting the EventsInterceptionEnabled property to true. This is especially useful when the child view is scrollable, as the GestureOverlayView instance then prevents the scrollable view from scrolling during gestures.
If the child view is scrollable, we need to indicate to the gesture overlay view what direction the view scrolls in. We do this by setting the Orientation
property to the child view scroll direction. This allows scroll gestures to be correctly recognized as scrolling and not a custom gesture.
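These two properties can be set together; a sketch for a vertically scrolling child view might look like this (assuming overlay is the GestureOverlayView instance from earlier, and that the orientation constant is bound as shown):

```csharp
// Stop child views from receiving touch events once a gesture is detected.
overlay.EventsInterceptionEnabled = true;

// The child scrolls vertically, so vertical movements are treated as
// scrolling rather than as the start of a custom gesture.
overlay.Orientation = GestureOverlayView.OrientationVertical;
```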