
Android Gestures

FROM http://developer.android.com/resources/articles/gestures.html

Touch screens are a great way to interact with applications on mobile devices. With a touch screen, users can easily tap, drag, fling, or slide to quickly perform actions in their favorite applications.

  • The Android framework makes it easy to recognize simple actions, like a swipe.
  • The new gestures API in Android 1.6 lets developers recognize more complicated gestures.
  • The android.gesture package lets you store, load, draw, and recognize gestures.

(download the source code of the examples)

 

Creating a gestures library

Android 1.6 and higher SDK platforms include a new application pre-installed on the emulator, called Gestures Builder. You can use this application to create a set of pre-defined gestures for your own application. It also serves as an example of how to let users define their own gestures in your applications. You can find the source code of Gestures Builder in the samples directory of each SDK platform. In our example we will use Gestures Builder to generate a set of gestures for us (make sure to create an AVD with an SD card image to use Gestures Builder). The screenshot below shows what the application looks like after adding a few gestures (showing 2 different sets):

As you can see, a gesture is always associated with a name. That name is very important because it identifies each gesture within your application. The names do not have to be unique; in fact, it can be very useful to have several gestures with the same name to increase the precision of the recognition. Every time you add or edit a gesture in Gestures Builder, a file is generated on the emulator's SD card, /sdcard/gestures. This file contains the description of all the gestures, and you will need to package it inside your application, in the resources directory under /res/raw.

Move the generated /sdcard/gestures file to the /res/raw folder in your project!
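One way to do this (assuming the emulator is running and adb is on your PATH; the project path below is just a placeholder) is to pull the file directly:

    adb pull /sdcard/gestures <your-project>/res/raw/gestures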

 

 

Loading the gestures library

Given a set of pre-defined gestures, you must load it inside your application. Assuming the file is saved as res/raw/spells (referenced as R.raw.spells):

mLibrary = GestureLibraries.fromRawResource(this, R.raw.spells);
if (!mLibrary.load()) {
    finish();
}

Structure of a gestures library saved to res/raw:

(Screenshot: contents of the gestures file)

Recognizing gestures

To start recognizing gestures in your application, all you have to do is add a GestureOverlayView to your XML layout:

e.g., res/layout/main.xml contains:

<android.gesture.GestureOverlayView
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="0dip"
    android:layout_weight="1.0" />

NOTE: you must use the fully qualified name, android.gesture.GestureOverlayView.

What is GestureOverlayView?

  • a simple drawing board on which the user can draw gestures
  • you can alter its drawing properties (a code sketch follows this list):
    • the color of the stroke used to draw gestures,
    • the width of that stroke,
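A minimal sketch of tweaking these properties in code, assuming the GestureOverlayView with id "gestures" from the layout above (the calls are the GestureOverlayView setters; place them, for example, in onCreate after setContentView):

GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
gestures.setGestureColor(Color.YELLOW);         // color of a stroke that can still become a gesture
gestures.setUncertainGestureColor(Color.GRAY);  // color of a stroke not (yet) considered a gesture
gestures.setGestureStrokeWidth(12.0f);          // width of the drawn stroke, in pixels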

Register Listener for User Drawing on GestureOverlayView

  • The most commonly used listener is GestureOverlayView.OnGesturePerformedListener, which fires when the user finishes drawing a gesture:
    GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
    gestures.addOnGesturePerformedListener(this);

Listener Code -- the class implementing GestureOverlayView.OnGesturePerformedListener recognizes the gesture

  • The gesture is delivered to you in the onGesturePerformed method.
  • You should call GestureLibrary.recognize(gesture) to get a list of Prediction instances, each with a name - the same name you entered in the Gestures Builder - and a score. The list is sorted by descending scores; the higher the score, the more likely the associated gesture is the one the user intended to draw. The following code snippet demonstrates how to retrieve the name of the first prediction:
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        ArrayList<Prediction> predictions = mLibrary.recognize(gesture); // ask the library to recognize the gesture against the loaded set

        // We want at least one prediction
        if (predictions.size() > 0) {
            Prediction prediction = predictions.get(0); // take the first (best) prediction
            // We want at least some confidence in the result
            if (prediction.score > 1.0) {
                // Show the spell
                Toast.makeText(this, prediction.name, Toast.LENGTH_SHORT).show();
            }
        }
    }

NOTE: the first prediction is taken into account only if its score is > 1.0.

NOTE: scores below 1.0 are typically poor matches.
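To get a feel for how the scores behave, you can log every prediction the library returns instead of only the best one; a small sketch (the log tag is just an example):

ArrayList<Prediction> predictions = mLibrary.recognize(gesture);
for (Prediction prediction : predictions) {
    // predictions are sorted by descending score; scores below 1.0 are usually poor matches
    Log.d("Gestures", prediction.name + " scored " + prediction.score);
}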

Gestures overlay

In the example above, the GestureOverlayView was used as a normal view, embedded inside a LinearLayout. However, as its name suggests, it can also be used as an overlay on top of other views. This can be useful to recognize gestures in a game or just anywhere in the UI of an application. In the second example, called GesturesListDemo, we'll create an overlay on top of a list of contacts. We start again in Gestures Builder to create a new set of pre-defined gestures: Add, Remove, etc.

 

And here is what the XML layout looks like:

<android.gesture.GestureOverlayView             -- the GestureOverlayView sits on top of a ListView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"

    android:gestureStrokeType="multiple"         -- allows gestures made of multiple strokes
    android:eventsInterceptionEnabled="true"     -- steal events from the child (ListView) once the user is drawing
    android:orientation="vertical">              -- the view underneath scrolls vertically, so a horizontal stroke is recognized as a gesture right away

    <ListView
        android:id="@android:id/list"  
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"  />

</android.gesture.GestureOverlayView>

In this application, the gestures view is an overlay on top of a regular ListView. The overlay also specifies a few properties that we did not need before (a programmatic equivalent is sketched after this list):

  • gestureStrokeType: indicates whether we want to recognize gestures made of a single stroke or multiple strokes. Since one of our gestures is the "+" symbol, we need multiple strokes

  • eventsInterceptionEnabled: when set to true, this property tells the overlay to steal the events from its children as soon as it knows the user is really drawing a gesture. This is useful when there's a scrollable view under the overlay, to avoid scrolling the underlying child as the user draws his gesture

  • orientation: indicates the scroll orientation of the views underneath. In this case the list scrolls vertically, which means that any horizontal gestures (like action_delete) can immediately be recognized as a gesture. Gestures that start with a vertical stroke must contain at least one horizontal component to be recognized. In other words, a simple vertical line cannot be recognized as a gesture since it would conflict with the list's scrolling.
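If you prefer, the same three properties can be set from code; a hedged sketch using the GestureOverlayView setters (assuming the overlay with id "gestures" from the layout above):

GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.gestures);
overlay.setGestureStrokeType(GestureOverlayView.GESTURE_STROKE_TYPE_MULTIPLE); // allow multi-stroke gestures like "+"
overlay.setEventsInterceptionEnabled(true);                                    // steal events from children once the user is drawing
overlay.setOrientation(GestureOverlayView.ORIENTATION_VERTICAL);               // views underneath scroll vertically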

The code used to load and set up the gestures library and overlay is exactly the same as before. The only difference is that we now check the name of the predictions to know what the user intended to do:

 

Listener Code -- getting the name of the predicted gesture

Code to test which prediction (by gesture name) was made:
public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    ArrayList<Prediction> predictions = mLibrary.recognize(gesture);
    if (predictions.size() > 0 && predictions.get(0).score > 1.0) {
        String action = predictions.get(0).name; //get the name of the predicted gesture
        if ("action_add".equals(action)) {
            Toast.makeText(this, "Adding a contact", Toast.LENGTH_SHORT).show();
        } else if ("action_delete".equals(action)) {
            Toast.makeText(this, "Removing a contact", Toast.LENGTH_SHORT).show();
        } else if ("action_refresh".equals(action)) {
            Toast.makeText(this, "Reloading contacts", Toast.LENGTH_SHORT).show();
        }
    }
}

The user is now able to draw his gestures on top of the list without interfering with the scrolling:

The overlay even gives visual clues as to whether the gesture is considered valid for recognition. In the case of a vertical overlay, for instance, a single vertical stroke cannot be recognized as a gesture and is therefore drawn with a translucent color:



SETUP: emulator containing the Gestures Builder app (built in to the Eclipse emulator)

 

(Screenshot: the built-in Gestures Builder app)

 

Create gestures using Gestures Builder and put the generated file in res/raw

 

User swipes from right to left

(Screenshot: swiping right to left)

After the gesture is recognized as SwipeRToL

(Screenshot: result, SwipeRToL detected)

Gesture EXERCISE

Step 1: Using Gestures Builder and the steps above, I created a gesture called "SwipeRToL" that captures a few different horizontal strokes starting at the right and ending at the left.

 

Step 2: Use the File Explorer to "pull a file from the device" (/mnt/sdcard/gestures) and add the gesture file to the project's res/raw folder.

(Screenshot: the gestures file in res/raw)

Step 3: Add the GestureOverlay to the Activity's layout (main.xml)

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
 android:layout_width="fill_parent"
 android:layout_height="fill_parent"
 android:orientation="vertical" >

<TextView
    android:layout_width="fill_parent"
    android:layout_height="wrap_content"
    android:text="@string/hello" />

<!-- the GestureOverlayView fills the remaining space of the Activity's layout -->
<android.gesture.GestureOverlayView
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="0dip"
    android:layout_weight="1.0" />


</LinearLayout>

 

Step 4: in your Activity:

  • load the gestures you stored in res/raw as a GestureLibrary

  • grab the GestureOverlayView (R.id.gestures) from the layout above

  • make this Activity the listener for that GestureOverlayView

  • implement OnGesturePerformedListener

 

import java.util.ArrayList;

import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.GestureOverlayView.OnGesturePerformedListener;
import android.gesture.Prediction;
import android.os.Bundle;
import android.widget.Toast;

public class Gesture_testActivity extends Activity implements OnGesturePerformedListener {

    GestureLibrary mLibrary;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Step 1: load the gestures stored in res/raw/gestures as a GestureLibrary
        mLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!mLibrary.load()) {
            finish();
        }

        // Steps 2 and 3: grab the GestureOverlayView and make this Activity its listener
        GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
        gestures.addOnGesturePerformedListener(this);
    }

    // Step 4: called automatically when a gesture is detected
    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Match the current gesture against the loaded GestureLibrary; this returns
        // zero or more Predictions of which gesture it may be
        ArrayList<Prediction> predictions = mLibrary.recognize(gesture);

        // We want at least one prediction
        if (predictions.size() > 0) {
            Prediction prediction = predictions.get(0);   // take the first (best) prediction
            // We want at least some confidence in the result
            if (prediction.score > 1.0) {                 // require a score > 1.0
                // Show the gesture name that was detected
                Toast.makeText(this, prediction.name, Toast.LENGTH_SHORT).show();
            }
        }
    }
}

 


 

Creating gestures in code (example) rather than loading pre-defined gestures from a file --- good if you only need the standard gestures

http://stackoverflow.com/questions/7535612/problem-with-swipe-fling-gesture-recognition-on-imageview-in-android

http://www.androidsnippets.com/gesturedetector-and-gesturedetectorongesturelistener

 

Example: screenshot where the user did a swipe from left to right, called a "fling"

import android.app.Activity;
import android.os.Bundle;
import android.widget.LinearLayout;
import android.view.MotionEvent;
import android.view.GestureDetector.OnGestureListener;
import android.view.GestureDetector;
import android.widget.TextView;
import android.graphics.Color;

public class Gesture_SwipeCoded2 extends Activity implements OnGestureListener {    
    private LinearLayout main;    
    private TextView viewA;
   
    private GestureDetector gestureScanner;
       
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
         
        gestureScanner = new GestureDetector(this);
         
        main = new LinearLayout(this);
        main.setBackgroundColor(Color.GRAY);
        main.setLayoutParams(new LinearLayout.LayoutParams(320,480));
         
        viewA = new TextView(this);
        viewA.setBackgroundColor(Color.YELLOW);
        viewA.setTextColor(Color.BLACK);
        viewA.setTextSize(16);
        viewA.setLayoutParams(new LinearLayout.LayoutParams(320,80));
        main.addView(viewA);
         
        setContentView(main);
    }
   
    @Override
    public boolean onTouchEvent(MotionEvent me) {
        return gestureScanner.onTouchEvent(me);
    }
   
    @Override
    public boolean onDown(MotionEvent e) {
        viewA.setText("-" + "DOWN" + "-");
        return true;
    }
   
    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
        viewA.setText("-" + "FLING" + "-");
        return true;
    }
   
    @Override
    public void onLongPress(MotionEvent e) {
        viewA.setText("-" + "LONG PRESS" + "-");
    }
   
    @Override
    public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
        viewA.setText("-" + "SCROLL" + "-");
        return true;
    }
   
    @Override
    public void onShowPress(MotionEvent e) {
        viewA.setText("-" + "SHOW PRESS" + "-");
    }    
   
    @Override
    public boolean onSingleTapUp(MotionEvent e) {
        viewA.setText("-" + "SINGLE TAP UP" + "-");
        return true;
    }
}
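The onFling callback above reports a fling in any direction. To tell a left-to-right swipe (as in the screenshot) apart from a right-to-left one, you could replace onFling with a version that compares the start and end x coordinates and the horizontal velocity; a sketch with made-up threshold values:

private static final int SWIPE_MIN_DISTANCE = 120;       // made-up minimum horizontal travel, in pixels
private static final int SWIPE_THRESHOLD_VELOCITY = 200; // made-up minimum horizontal velocity

@Override
public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
    float deltaX = e2.getX() - e1.getX();
    if (deltaX > SWIPE_MIN_DISTANCE && velocityX > SWIPE_THRESHOLD_VELOCITY) {
        viewA.setText("-" + "FLING LEFT TO RIGHT" + "-");   // finger moved left -> right
    } else if (-deltaX > SWIPE_MIN_DISTANCE && -velocityX > SWIPE_THRESHOLD_VELOCITY) {
        viewA.setText("-" + "FLING RIGHT TO LEFT" + "-");   // finger moved right -> left
    } else {
        viewA.setText("-" + "FLING" + "-");
    }
    return true;
}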

 

 

 

Busy layout --- how to detect gestures that may span multiple view objects

Use a GestureOverlayView. This gives you a transparent overlay on top of your background view(s) and can be used to detect things like a swipe without the underlying views needing to know about gestures.

In the layout .xml file, add the following:


<android.gesture.GestureOverlayView
    android:id="@+id/gestures"
    android:layout_width="fill_parent"
    android:layout_height="0dip"
    android:layout_weight="1.0" />

A gesture overlay acts as a simple drawing board on which the user can draw his gestures. You can tweak several visual properties, like the color and the width of the stroke used to draw gestures, and register various listeners to follow what the user is doing. The most commonly used listener is GestureOverlayView.OnGesturePerformedListener, which fires whenever a user is done drawing a gesture:
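Besides OnGesturePerformedListener, the overlay also accepts a GestureOverlayView.OnGestureListener, which reports when a drawing starts, progresses, and ends; a minimal sketch (assuming the overlay with id "gestures"; the log tag is just an example):

GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
gestures.addOnGestureListener(new GestureOverlayView.OnGestureListener() {
    public void onGestureStarted(GestureOverlayView overlay, MotionEvent event) {
        Log.d("Gestures", "user started drawing");
    }
    public void onGesture(GestureOverlayView overlay, MotionEvent event) {
        // called for each movement while the user keeps drawing
    }
    public void onGestureEnded(GestureOverlayView overlay, MotionEvent event) {
        Log.d("Gestures", "user finished drawing");
    }
    public void onGestureCancelled(GestureOverlayView overlay, MotionEvent event) {
        Log.d("Gestures", "drawing was cancelled");
    }
});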

 

 

 


DELIVERABLE: Turn in to BB->Exercises->Exercise 11.1 the URL of your YouTube video showing the app you created in the Gesture EXERCISE. See info on how to do this.

 
