Creating custom gesture recognisers for iOS

A custom gesture recogniser simplifies the job of detecting specific touch patterns on an iPhone or iPad. Although some basic gestures (long press, double tap, pinching and zooming, and so on) are already available from Apple out of the box, more complex finger movements must be detected yourself. By implementing your detection logic as a gesture recogniser, you make that logic much easier to reuse within UIKit. So here’s how to do it.

Subclass UIGestureRecognizer in your project

Create a new Objective-C class in Xcode, making sure you extend the existing UIGestureRecognizer class from UIKit.
In the header file, add an import statement for UIKit/UIGestureRecognizerSubclass.h, a category on UIGestureRecognizer that makes some private features of the class available to subclasses (for example, it makes the state property read/write rather than read-only).
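A minimal sketch of such a header (MyGestureRecognizer is a purely illustrative name):

// MyGestureRecognizer.h
#import <UIKit/UIKit.h>
// Category that exposes the otherwise read-only state property (and a few
// other internals) to subclasses
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface MyGestureRecognizer : UIGestureRecognizer

@end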

Make all of the required methods available in the implementation (.m) file. You do not need to declare them in the header (.h), as these methods are inherited from the base class. When overriding these methods, you must also call the superclass implementation.
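The skeleton you need looks roughly like this (same illustrative class name as above), with the superclass calls already in place:

// MyGestureRecognizer.m
#import "MyGestureRecognizer.h"

@implementation MyGestureRecognizer

// Called by the base class before a new recognition attempt starts
- (void)reset {
    [super reset];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
}

@end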

How a gesture recogniser works

All of the work is done in the methods you have overridden above. They behave in exactly the same way as the corresponding methods on a UIView or UIViewController instance. Your job here is to detect the gesture you want and update the state of your recogniser accordingly. Everything else (forwarding touches to the views, invoking selectors on targets, etc.) is handled by the base class.
The state property is clearly a very important element. It is of type UIGestureRecognizerState, an enumeration from UIGestureRecognizer.h.
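Its declaration is essentially the following (newer SDKs write it with the NS_ENUM macro; older ones use a plain typedef enum, but the values are the same):

typedef NS_ENUM(NSInteger, UIGestureRecognizerState) {
    UIGestureRecognizerStatePossible,    // default state; still evaluating touches
    UIGestureRecognizerStateBegan,       // a continuous gesture has started
    UIGestureRecognizerStateChanged,     // a continuous gesture has progressed
    UIGestureRecognizerStateEnded,       // the gesture has completed successfully
    UIGestureRecognizerStateCancelled,   // the gesture has been cancelled
    UIGestureRecognizerStateFailed,      // the touches cannot be the gesture

    // Alias used for discrete gestures
    UIGestureRecognizerStateRecognized = UIGestureRecognizerStateEnded
};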

All recognisers start in the UIGestureRecognizerStatePossible state. Then there are two types of gestures: discrete and continuous.

Discrete gestures

A discrete gesture is either performed or it isn’t. If the right moves are made on the touch screen, the gesture is recognised (UIGestureRecognizerStateRecognized) and the delegates are invoked. Otherwise, as soon as an unexpected move is detected, the recogniser fails (UIGestureRecognizerStateFailed). A sketch of this is shown below.
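In practice, a discrete recogniser can do all of its decision making when the finger lifts, for example (a sketch, not code from the post; gestureLooksValid is a hypothetical helper that checks the recorded touches):

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    if ([self gestureLooksValid]) {
        self.state = UIGestureRecognizerStateRecognized;
    } else {
        self.state = UIGestureRecognizerStateFailed;
    }
}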

Continuous gestures

Continuous gestures, on the other hand, start in a recognisable way (at which point the state is set to UIGestureRecognizerStateBegan) and then carry on. Each time a new part of the gesture is detected, the state is set to UIGestureRecognizerStateChanged and the delegates are continuously notified of the progression of the gesture. If an unexpected move is made at any point, the state is set to UIGestureRecognizerStateCancelled; if an acceptable concluding move is made, it is set to UIGestureRecognizerStateEnded (the same value as UIGestureRecognizerStateRecognized, but the Apple docs seem to suggest this is the preferred name for continuous gestures).
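A continuous recogniser drives those transitions from the touch-handling overrides, along these lines (again only a sketch; touchesStillMatchGesture is a hypothetical helper):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    if (![self touchesStillMatchGesture]) {
        self.state = UIGestureRecognizerStateCancelled;   // unexpected move
    } else if (self.state == UIGestureRecognizerStatePossible) {
        self.state = UIGestureRecognizerStateBegan;       // start of the gesture detected
    } else {
        self.state = UIGestureRecognizerStateChanged;     // the gesture is progressing
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    self.state = UIGestureRecognizerStateEnded;           // acceptable concluding move
}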

The reset method

When the state of your recogniser instance is set to UIGestureRecognizerStateEnded (or UIGestureRecognizerStateRecognized, which as we saw is practically a synonym), the base class notifies the delegates and then invokes the reset method, just before putting the state back to UIGestureRecognizerStatePossible. In this method you should reset the values of any instance variables, ready to start detecting again when new touches begin on the screen.

Circle detection: an example

The reason for looking into gesture recognisers in the first place was that I needed a way of detecting small circular movements on the screen for my Tube Finder app. So here’s how I implemented a circle gesture recogniser based on code by Jeff Lamarche.

The interface

Internally the recogniser holds an array of points, the coordinates of the first touch that triggered recognition, and the timestamp of that first touch. It then exposes a series of properties that let you tweak the way the circle detection algorithm works; I’ll leave those to Jeff Lamarche’s post. Three more read-only properties expose all of the points that have been touched in this gesture and, once the gesture is complete, the radius and centre of the circle.
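Based on that description, the header looks roughly like this (the property and ivar names here are illustrative, not necessarily those in the real source; see the attached project for the actual interface):

// CircleGestureRecognizer.h (sketch)
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface CircleGestureRecognizer : UIGestureRecognizer {
    NSMutableArray *touchedPoints;      // every point the finger has passed over
    CGPoint         firstTouchPoint;    // where the gesture started
    NSTimeInterval  firstTouchTime;     // when the gesture started
}

// Tuning parameters for the detection algorithm (see Jeff Lamarche's post)
@property (nonatomic, assign) NSTimeInterval maximumCircleTime;
@property (nonatomic, assign) CGFloat radiusVariancePercent;

// Results, populated as the gesture is recognised
@property (nonatomic, readonly) NSArray *points;
@property (nonatomic, readonly) CGFloat radius;
@property (nonatomic, readonly) CGPoint center;

@end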

As far as the implementation goes, here are the important bits (the rest you can check out in the attached source code).

The implementation

We set up all of the initial values in the -init method.

We do the same in the -reset method, which is invoked after a gesture has been detected or has failed (but not, disappointingly, before the recogniser is used for the first time).
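In outline, both end up calling the same set-up logic (a sketch; resetValues is a hypothetical shared helper, and the exact code is in the attached project):

// CircleGestureRecognizer.m (excerpt, sketch)
- (id)init {
    if ((self = [super init])) {
        touchedPoints = [[NSMutableArray alloc] init];
        [self resetValues];
    }
    return self;
}

- (void)reset {
    [super reset];
    [self resetValues];
}

- (void)resetValues {
    [touchedPoints removeAllObjects];
    firstTouchPoint = CGPointZero;
    firstTouchTime  = 0;
}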

And then the detection itself. When a finger touches the screen, we record the timestamp and the point of the touch (-touchesBegan:withEvent:). As the finger moves across the screen, we save the points it passes over (-touchesMoved:withEvent:). The actual detection is done when the finger is lifted from the screen (-touchesEnded:withEvent:), where we evaluate how long it took to draw the circle (if too long, it’s no good) and the placement of the various points.
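Put together, the touch handling looks roughly like this (the circle-fitting test itself is left out; pointsFormCircle stands in for the checks described in Jeff Lamarche’s post and implemented in the attached source):

// CircleGestureRecognizer.m (excerpt, sketch)
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UITouch *touch  = [touches anyObject];
    firstTouchPoint = [touch locationInView:self.view];
    firstTouchTime  = touch.timestamp;
    [touchedPoints removeAllObjects];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint point  = [touch locationInView:self.view];
    [touchedPoints addObject:[NSValue valueWithCGPoint:point]];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    UITouch *touch = [touches anyObject];

    // Took too long? Then it is not the quick circular movement we are after.
    if (touch.timestamp - firstTouchTime > self.maximumCircleTime) {
        self.state = UIGestureRecognizerStateFailed;
        return;
    }

    // Check the placement of the recorded points (details omitted)
    self.state = [self pointsFormCircle] ? UIGestureRecognizerStateRecognized
                                         : UIGestureRecognizerStateFailed;
}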

The project

A sample project and the source code for the gesture recogniser are available on github at /fmestrone/Circle-Detection-for-iOS.
