Responding to custom user gestures

There are other types of gestures that the user can perform, such as drawing letters or other shapes. These gestures may be more complex than a twist or a drag, so we cannot rely on simple finger tracking alone; instead, we need shape recognition.

How to do it...

We can store, recognize, and visualize shape gestures using a GestureLibrary instance:

  1. We access a particular GestureLibrary by requesting one through the GestureLibraries type:
    GestureLibrary library = GestureLibraries.FromPrivateFile(this, "gestures");
  2. When we want to load the gestures, or refresh the library, we invoke the Load() method:
    library.Load();
  3. Then, we can list all the gestures in the library using the GestureEntries property:
    string[] entries = library.GestureEntries.ToArray();
  4. To read the data about a specific gesture, we use the GetGestures() method:
    Gesture gesture = library.GetGestures("my gesture")[0];
  5. With the gesture, we can visualize the pattern as a Bitmap instance:
    var bitmap = gesture.ToBitmap(100, 100, 0, Color.Yellow);
  6. We can also visualize the gesture strokes as a Path instance:
    var path = gesture.ToPath();
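Taken together, the loading and visualization steps might be sketched as follows; the gesture name "my gesture" is only a placeholder, and we guard against Load() failing or the name not being found:

```csharp
// Requires: using Android.Gestures; using Android.Graphics;
GestureLibrary library = GestureLibraries.FromPrivateFile(this, "gestures");
if (library.Load())
{
    // List the names of all stored gestures.
    foreach (string name in library.GestureEntries)
    {
        Console.WriteLine(name);
    }

    // Read the gestures stored under a particular name and render
    // the first one as a 100x100 yellow thumbnail and as a path.
    var gestures = library.GetGestures("my gesture");
    if (gestures != null && gestures.Count > 0)
    {
        var bitmap = gestures[0].ToBitmap(100, 100, 0, Color.Yellow);
        var path = gestures[0].ToPath();
    }
}
```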

If we want to create a new gesture, we can make use of the GestureOverlayView instance and save the gesture to the library:

  1. First, we add a <android.gesture.GestureOverlayView> element to the layout:
    <android.gesture.GestureOverlayView
      android:id="@+id/overlay"
      android:layout_width="match_parent"
      android:layout_height="match_parent" />

    Then we can access the overlay using the FindViewById() method:

    var overlay = FindViewById<GestureOverlayView>(Resource.Id.overlay);

    Alternatively, we can create and add a GestureOverlayView from code:

    var overlay = new GestureOverlayView(this);
  2. To prevent the drawn gesture from disappearing as soon as the user has finished, we can set the GestureStrokeType to Multiple:
    overlay.GestureStrokeType = GestureStrokeType.Multiple;
  3. After the user has drawn the gesture on the overlay, we can save the gesture to the library:
    library.AddGesture("name", overlay.Gesture);
    library.Save();
  4. To allow a new gesture to be drawn, we can clean up the overlay:
    overlay.Clear(false);
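The recording steps above can be combined into a single flow; in this sketch, a hypothetical saveButton commits the drawn gesture to the library (the button, the gesture name, and the overlay id are assumptions, not part of the recipe):

```csharp
// Requires: using Android.Gestures;
var overlay = FindViewById<GestureOverlayView>(Resource.Id.overlay);

// Keep multi-stroke gestures on screen until we clear them explicitly.
overlay.GestureStrokeType = GestureStrokeType.Multiple;

// A hypothetical button that commits the drawn gesture.
saveButton.Click += (sender, e) =>
{
    var library = GestureLibraries.FromPrivateFile(this, "gestures");
    library.Load();

    // Store the strokes currently on the overlay, persist the
    // library, and reset the surface for the next gesture.
    library.AddGesture("name", overlay.Gesture);
    library.Save();
    overlay.Clear(false);
};
```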

When we want to respond to a user's gesture, we can place a GestureOverlayView instance on the UI and wait for the user to draw on it:

  1. To respond to a gesture that the user has drawn, we can attach a handler to the GesturePerformed event:
    overlay.GesturePerformed += (sender, e) => {
    };
  2. Then, we check the gesture against the library:
    var recognitions = library.Recognize(e.Gesture);
  3. In the result, we can find the best guess:
    var guess = recognitions.FirstOrDefault(r => r.Score > 1);
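Putting the recognition steps together, the handler might look like this sketch (this assumes the library was loaded beforehand, and uses a Toast purely as example feedback):

```csharp
// Requires: using Android.Gestures; using Android.Widget; using System.Linq;
var overlay = FindViewById<GestureOverlayView>(Resource.Id.overlay);
var library = GestureLibraries.FromPrivateFile(this, "gestures");
library.Load();

overlay.GesturePerformed += (sender, e) =>
{
    // Score the drawn gesture against every gesture in the library.
    var recognitions = library.Recognize(e.Gesture);

    // Predictions come back ordered by score; treat > 1.0 as a good match.
    var guess = recognitions.FirstOrDefault(r => r.Score > 1);
    if (guess != null)
    {
        Toast.MakeText(this, guess.Name, ToastLength.Short).Show();
    }
};
```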

How it works...

Simple gesture recognition involves tracking the movement of one or more fingers across the device screen. This data is used to update the screen or manipulate objects in real time. Such gestures are limited to the data provided every few milliseconds and usually do not track the entire gesture.

More advanced gestures require the entire gesture to be performed before it can be processed. These types of gestures are delayed and do not perform any direct manipulations. However, feedback to the user may be required in the form of a path being traced onto the screen.

Note

Some gestures are more complex and require the entire gesture to be drawn before it can be processed.

Android provides an easy way to both record and recognize gestures through the GestureLibrary type. A GestureLibrary instance can be obtained through one of the methods on the GestureLibraries type, either from a raw resource or from a file on the file system.

When we have created the gesture library instance, we load or reload the gestures into memory using the Load() method. If we then update the library, we can persist the changes using the Save() method.

Note

Both the Load() and Save() methods need to be invoked manually in order to load and save the gesture library, respectively.

Once loaded, we can list the gestures using the GestureEntries property, or query specific gestures using the GetGestures() method. After we have queried a specific gesture, we can make use of the various methods to convert it into a visual Path or Bitmap instance.

In order to build up the library of gestures, we can either load an existing library file, or we can create and add new gestures to the library. Regardless of how gestures are created, we add them to the library using the AddGesture() method. We pass the gesture, along with a name, to this method. After adding gestures, we need to ensure that we save the library.

Tip

Adding multiple gestures with the same name can improve the chances of the drawn gesture being correctly recognized.

Gestures can be created by building a collection of timestamped coordinates. A series of GesturePoints is added to a GestureStroke instance, which in turn is added to a gesture.
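As a sketch, a gesture could be built from raw points like this (the coordinates and timings are arbitrary, and we assume the GestureStroke constructor accepts the point list as bound in the Android.Gestures namespace):

```csharp
// Requires: using Android.Gestures; using System.Collections.Generic;
// Build a single diagonal stroke from timestamped points.
var points = new List<GesturePoint>();
long start = Java.Lang.JavaSystem.CurrentTimeMillis();
for (int i = 0; i < 10; i++)
{
    points.Add(new GesturePoint(i * 10f, i * 10f, start + i * 16));
}

// Group the points into a stroke, and the stroke into a gesture.
var gesture = new Gesture();
gesture.AddStroke(new GestureStroke(points));
```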

Another means of creating gestures is via the GestureOverlayView instance. When displayed, the user can trace the gesture, which we can then read and save into a library. Some gestures require multiple strokes, thus we need to set the GestureStrokeType property to Multiple. This prevents the view from clearing the surface as soon as the user lifts all their fingers off the view.

Note

When adding the GestureOverlayView instance to the .axml file, the fully qualified name, android.gesture.GestureOverlayView, must be used as it is not part of the usual Android.Widget namespace.

Now that we have a collection of gestures in a gesture library, we need to be able to recognize user gestures. This is done by passing the gesture to be recognized to the Recognize() method of the library. The Recognize() method returns a collection of scored predictions or matches.

Each prediction is provided a score or match rating, and the match collection is ordered by score from the highest match to the lowest. Predictions with scores above 1.0 indicate good matches, and scores below 1.0 are poor matches. However, any threshold can be used to determine the desired match.

Tip

Gesture predictions with a score of above 1.0 usually indicate good matches.

When using the GestureOverlayView instance to draw gestures, there are several events that we can subscribe to. One of the most common events is GesturePerformed, which is raised as soon as the view detects that the user has finished drawing a gesture.

There's more...

The GestureOverlayView instance can also be used to overlay regular views, and detect gestures when the user interacts with those views. As the GestureOverlayView instance is simply a FrameLayout instance, child views can be added to it in the same way views are added to other layouts.

As child views will also be processing events, we need to ensure that, as soon as the gesture overlay view detects a gesture, it prevents the child views from processing touch events. This is done by setting the EventsInterceptionEnabled property to true. This is especially useful when the child view is a scrollable view, as the GestureOverlayView then prevents the scrollable view from scrolling during gestures.

If the child view is scrollable, we need to indicate to the gesture overlay view what direction the view scrolls in. We do this by setting the Orientation property to the child view scroll direction. This allows scroll gestures to be correctly recognized as scrolling and not a custom gesture.
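For example, overlaying a vertically scrolling list might be configured as follows (the ListView child and the OrientationVertical constant name are assumptions based on the binding):

```csharp
// Requires: using Android.Gestures; using Android.Widget;
var overlay = FindViewById<GestureOverlayView>(Resource.Id.overlay);

// Steal touch events from children once a gesture is detected,
// so the list does not scroll while the user is drawing.
overlay.EventsInterceptionEnabled = true;

// Tell the overlay that the child scrolls vertically, so plain vertical
// swipes are passed through as scrolling rather than captured as gestures.
overlay.Orientation = GestureOverlayView.OrientationVertical;

// The overlay is a FrameLayout, so children are added as usual.
overlay.AddView(new ListView(this));
```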
