A UIEvent object (or, simply, an event object) represents an event in iOS. There are three general types of event: touch events, motion events, and remote-control events. Remote-control events allow a responder object to receive commands from an external accessory or headset so that it can manage audio and video—for example, playing a video or skipping to the next audio track. Motion events were introduced in iOS 3.0 and remote-control events in iOS 4.0.
A touch type of event object contains one or more touches (that is, finger gestures on the screen) that have some relation to the event. A touch is represented by a UITouch object. When a touch event occurs, the system routes it to the appropriate responder and passes in the UIEvent object in a message invoking a UIResponder method such as touchesBegan:withEvent:. The responder can then evaluate the touches for the event or for a particular phase of the event and handle them appropriately. The methods of UIEvent allow you to obtain all touches for the event (allTouches) or only those for a given view or window (touchesForView: or touchesForWindow:). You can also distinguish an event object from objects representing other events by querying an object for the time of its creation (timestamp).
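As a sketch of how a responder might use UIEvent's allTouches, touchesForView:, and timestamp accessors (the view class and logging here are hypothetical, not from the original):

```objc
#import <UIKit/UIKit.h>

// Hypothetical view that inspects the touches carried by an event.
@interface TouchInspectingView : UIView
@end

@implementation TouchInspectingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // All touches belonging to the event, regardless of view.
    NSSet *everyTouch = [event allTouches];

    // Only the touches that landed in this view.
    NSSet *myTouches = [event touchesForView:self];

    // The event's creation time distinguishes it from other events.
    NSTimeInterval when = [event timestamp];

    NSLog(@"%lu touches total, %lu in this view, at t=%.3f",
          (unsigned long)[everyTouch count],
          (unsigned long)[myTouches count],
          when);
}

@end
```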
A UIEvent object representing a touch event is persistent throughout a multi-touch sequence; UIKit reuses the same UIEvent instance for every event delivered to the application. You should never retain an event object or any object returned from an event object. If you need to keep information from an event around from one phase to another, you should copy that information from the UITouch or UIEvent object.
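A minimal sketch of that copy-don't-retain rule (the view class and instance variables are hypothetical): rather than holding on to the UIEvent or a UITouch, cache the plain values you need from one phase to the next:

```objc
#import <UIKit/UIKit.h>
#import <math.h>

// Hypothetical view that carries touch state across event phases.
@interface DraggableView : UIView {
    CGPoint _beginPoint;        // copied value, not a retained UITouch
    NSTimeInterval _beginTime;  // copied value, not a retained UIEvent
}
@end

@implementation DraggableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Copy the information out of the touch and event; retain neither.
    _beginPoint = [touch locationInView:self];
    _beginTime = [event timestamp];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint endPoint = [touch locationInView:self];
    NSLog(@"Dragged %.1f pt in %.3f s",
          hypot(endPoint.x - _beginPoint.x, endPoint.y - _beginPoint.y),
          [event timestamp] - _beginTime);
}

@end
```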
You can obtain event types and subtypes from the type and subtype properties. UIEvent defines event types for touch, motion, and remote-control events. It also defines a motion subtype for “shake” events and a series of subtype constants for remote-control events, such as “play” and “previous track.” The first responder or any responder in the responder chain implements the motion-related methods of UIResponder (such as motionEnded:withEvent:) to handle shaking-motion events. To handle remote-control events, a responder object must implement the remoteControlReceivedWithEvent: method of UIResponder.
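As an illustration (the view controller class is hypothetical), a responder might handle both kinds of event by checking UIEvent's subtype constants:

```objc
#import <UIKit/UIKit.h>

// Hypothetical view controller that responds to shakes and remote-control commands.
@interface MediaViewController : UIViewController
@end

@implementation MediaViewController

// Must be first responder (or in the responder chain) to receive these events.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

// Shake events arrive through the motion-related UIResponder methods.
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        NSLog(@"Shake detected");
    }
}

// Remote-control events arrive through remoteControlReceivedWithEvent:.
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlPlay:
                NSLog(@"Play");
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                NSLog(@"Previous track");
                break;
            default:
                break;
        }
    }
}

@end
```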
The touchesForGestureRecognizer: method, which was introduced in iOS 3.2, allows you to query a gesture-recognizer object (an instance of a subclass of UIGestureRecognizer) for the touches it is currently handling.
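A brief sketch of querying an event for the touches a recognizer is handling (the pan recognizer and its setup are hypothetical):

```objc
#import <UIKit/UIKit.h>

// Hypothetical view that inspects the touches a gesture recognizer is handling.
@interface GestureAwareView : UIView {
    UIPanGestureRecognizer *_panRecognizer;  // assumed attached elsewhere
}
@end

@implementation GestureAwareView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Available in iOS 3.2 and later.
    NSSet *panTouches = [event touchesForGestureRecognizer:_panRecognizer];
    NSLog(@"The pan recognizer is handling %lu touch(es)",
          (unsigned long)[panTouches count]);
}

@end
```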
Getting the Touches for an Event
allTouches
touchesForView:
touchesForWindow:
Getting Event Attributes
timestamp property
Getting the Event Type
type property
subtype property
Getting the Touches for a Gesture Recognizer
touchesForGestureRecognizer:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSMutableSet *currentTouches = [[[event touchesForView:self] mutableCopy] autorelease];
    [currentTouches minusSet:touches];
    if ([currentTouches count] > 0) {
        [self updateOriginalTransformForTouches:currentTouches];
        [self cacheBeginPointForTouches:currentTouches];
    }
    [self cacheBeginPointForTouches:touches];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGAffineTransform incrementalTransform = [self incrementalTransformWithTouches:[event touchesForView:self]];
    self.transform = CGAffineTransformConcat(originalTransform, incrementalTransform);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        if (touch.tapCount >= 2) {
            [self.superview bringSubviewToFront:self];
        }
    }
    [self updateOriginalTransformForTouches:[event touchesForView:self]];
    [self removeTouchesFromCache:touches];
    NSMutableSet *remainingTouches = [[[event touchesForView:self] mutableCopy] autorelease];
    [remainingTouches minusSet:touches];
    [self cacheBeginPointForTouches:remainingTouches];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
The code in the example project's application delegate indicates (and I've read elsewhere) that the touch event object passed to touchesBegan, touchesMoved, and touchesEnded will be the same object for the duration of a single set of user actions, such as touching and moving a finger. When I subclass UIScrollView and implement these methods, the events I get back are different objects. What am I missing here?
You are right that the UIEvent is reused when delivering touch events for one gesture. From the UIEvent class reference:
A UIEvent object representing a touch event is persistent throughout a multi-touch sequence; UIKit reuses the same UIEvent instance for every event delivered to the application. You should never retain an event object or any object returned from an event object. If you need to keep information from an event around from one phase to another, you should copy that information from the UITouch or UIEvent object.
I presume the difference in behavior in your case results from the special event handling done by UIScrollView. Scroll views delay event delivery because they need to detect a scrolling intent by the user (swipe gestures), so they must have a way of keeping UIEvents around—probably copying them to make sure they retain their original state. This might be the reason you see different objects.
Note that all of the above is only guessing.
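For context, the delaying behavior itself is exposed through two real UIScrollView properties; a small sketch of configuring them (the scroll-view instance here is hypothetical):

```objc
#import <UIKit/UIKit.h>

// Assuming an existing scroll view; these UIScrollView properties
// control how touch delivery to content subviews is delayed.
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectZero];

// When YES (the default), the scroll view briefly withholds touch-down
// events from subviews until it decides the gesture is not a scroll.
scrollView.delaysContentTouches = YES;

// When YES (the default), the scroll view may send touchesCancelled:
// to a subview once it concludes the user is actually scrolling.
scrollView.canCancelContentTouches = YES;
```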
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self superview] bringSubviewToFront:self];
    NSMutableSet *currentTouches = [[[event touchesForView:self] mutableCopy] autorelease];
    [currentTouches minusSet:touches];
    if ([currentTouches count] > 0) {
        [self updateOriginalTransformForTouches:currentTouches];
        [self cacheBeginPointForTouches:currentTouches];
    }
    [self cacheBeginPointForTouches:touches];
}