Modern Android apps often use gestures to provide fluid, natural interaction. There are a few ways to handle these interactions, and in this post I’m going to cover the basics of easily adding gesture support to your app.
The code for this sample app can be found on GitHub.
The basic building blocks of this simple gesture-handling code are the GestureDetector and the various OnGestureListeners. These classes allow you to easily detect and handle certain gestures, without having to track touch events or do the math to determine what type of gesture they form.
The Gesture Detector
The GestureDetector is a class that monitors touch events to determine when they adhere to a specified gesture, and converts those touches into meaningful data. In the sample code, we’re going to be using a ScaleGestureDetector in particular, but they generally all work the same.
The basic idea of a GestureDetector is that it reads your view’s MotionEvents (captured either by overriding the onTouchEvent(MotionEvent event) method or by adding a View.OnTouchListener to your view) and passes data interpreted from those touches to a listener when they match the desired gesture.
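The listener route, for instance, might look roughly like this (a sketch, assuming an imageView and an mScaleGestureDetector that have already been constructed elsewhere):

```java
// Route the view's touch stream into an existing gesture detector,
// without subclassing the view.
imageView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Hand every touch event to the detector for inspection.
        return mScaleGestureDetector.onTouchEvent(event);
    }
});
```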
The Gesture Listener
The OnGestureListener is called whenever the gesture has been triggered, to allow you to handle that gesture however you see fit. By the time the OnGestureListener is called, the gesture has been translated into meaningful data, like a scroll delta or a scale factor, so you don’t need to do the math and touch tracking yourself.
This makes it very easy to add simple gesture support to your app. Let’s take a look at an example.
In our example, we’ll be creating a simple app that displays an image and allows the user to scale it using the standard pinch gesture.
To accomplish this, we’ll be creating a very simple ImageView subclass, called TouchImageView, and overriding the onTouchEvent(MotionEvent event) method.
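A minimal sketch of such a TouchImageView, based on the pieces described below (member names, the constructor, and the matrix handling are my reconstruction, not the original listing):

```java
import android.content.Context;
import android.graphics.Matrix;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.widget.ImageView;

public class TouchImageView extends ImageView
        implements ScaleGestureDetector.OnScaleGestureListener {

    private final ScaleGestureDetector mScaleGestureDetector;
    private final Matrix mMatrix = new Matrix();
    private float mScaleFactor = 1f;

    public TouchImageView(Context context, AttributeSet attrs) {
        super(context, attrs);
        // Use a matrix scale type so we can control the image transform directly.
        setScaleType(ScaleType.MATRIX);
        // The view acts as its own OnScaleGestureListener.
        mScaleGestureDetector = new ScaleGestureDetector(context, this);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Give the gesture detector first crack at every touch event.
        boolean handled = mScaleGestureDetector.onTouchEvent(event);
        // Anything the detector doesn't handle falls through to ImageView.
        return handled || super.onTouchEvent(event);
    }

    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        // Accumulate the incremental scale reported by the detector.
        mScaleFactor *= detector.getScaleFactor();
        mMatrix.setScale(mScaleFactor, mScaleFactor);
        setImageMatrix(mMatrix);
        return true; // Consume the event so the detector resets its delta.
    }

    @Override
    public boolean onScaleBegin(ScaleGestureDetector detector) {
        return true; // Must return true to receive subsequent onScale() calls.
    }

    @Override
    public void onScaleEnd(ScaleGestureDetector detector) {
        // Nothing to clean up in this simple example.
    }
}
```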
As you can see in the code above, our custom ImageView instantiates a ScaleGestureDetector and implements the ScaleGestureDetector.OnScaleGestureListener callbacks itself.
The important piece that triggers the gesture detector is the call to mScaleGestureDetector.onTouchEvent(event), which returns true if the gesture is satisfied and the detector will be handling the event.
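That override might look something like this (an excerpt-style sketch, assuming the mScaleGestureDetector field from the class above):

```java
@Override
public boolean onTouchEvent(MotionEvent event) {
    // Let the scale gesture detector inspect the event first.
    boolean handled = mScaleGestureDetector.onTouchEvent(event);
    // If the detector isn't handling it, defer to ImageView's behavior.
    return handled || super.onTouchEvent(event);
}
```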
If the gesture detector doesn’t handle the event, then we pass it through to the superclass to handle.
When the events constitute a scale gesture (pinch to zoom), the gesture detector will convert that gesture into a scale factor, accessible via detector.getScaleFactor(), and pass it to the onScale(ScaleGestureDetector) method of the listener.
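An onScale implementation along those lines might look like this (a sketch, assuming mMatrix and mScaleFactor fields and a view whose scale type is ScaleType.MATRIX):

```java
@Override
public boolean onScale(ScaleGestureDetector detector) {
    // getScaleFactor() reports the scale change since the last event,
    // so multiply it into our running scale.
    mScaleFactor *= detector.getScaleFactor();
    // Apply the new scale to the image through the view's matrix.
    mMatrix.setScale(mScaleFactor, mScaleFactor);
    setImageMatrix(mMatrix);
    return true; // Consume the event so the detector resets its delta.
}
```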
This is where we simply set the scale of the image using the image matrix. Most apps would likely do more here, like checking that the scale factor remains within certain bounds so that the image doesn’t get scaled down to nothing, but that’s left as an exercise for the reader.
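For the bounds check, one simple approach is to clamp the accumulated scale factor before applying it (the ScaleLimits helper and the 0.5–3.0 range here are illustrative, not part of the sample):

```java
public final class ScaleLimits {
    private ScaleLimits() {}

    /** Clamps a scale factor into [min, max] so the image can't vanish or blow up. */
    public static float clampScale(float scale, float min, float max) {
        return Math.max(min, Math.min(scale, max));
    }
}
```

In onScale(), you’d then write something like mScaleFactor = ScaleLimits.clampScale(mScaleFactor * detector.getScaleFactor(), 0.5f, 3f) before updating the matrix.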
That’s all there is to this custom view. With these minor additions, and the help of the ScaleGestureDetector, we were able to add pinch-to-zoom to a simple ImageView with minimal code.
Extending the gesture support is also easy: simply add more GestureDetectors. In Fragment, for instance, we use three different GestureDetectors (translate, scale, rotate) on the same view to achieve smooth, gesture-based editing.
You can also check out the Android Gesture Detectors project on GitHub for some more gesture detectors that work with multi-touch gestures. (I didn’t write that library; I just converted it to Gradle and deployed it to Maven Central. I’ve got a PR out to the owner to merge those changes in and upload to his own Maven Central groupId.)