1
CHAPTER 7 Touch Gestures
2
Chapter objectives:
To code the detection of and response to touch gestures
Patterns of common touches used to create touch gestures that can be interpreted by an Android device
The use of MotionEvents
The differences between touch events and motion events
How to build applications using multi-touch gestures
3
7.1 Touchscreens
Most Android devices are equipped with a capacitive touchscreen. Capacitive touchscreens rely on the electrical properties of the human body to detect when and where on a display the user is touching. Capacitive technology provides numerous design opportunities for Android developers.
4
7.2 Touch Gestures
Gestures are the primary way in which users interact with most Android devices. Touch gestures represent a fundamental form of communication with an Android device. A touch gesture is an action, typically a movement of a user’s finger on a touchscreen.
5
7.3 The Basics of Touch Events
6
The MotionEvent class provides a collection of methods to report on the properties of a given touch gesture. Motion events describe movements in terms of an action code and a set of axis values. Each action code specifies a state change produced by a touch occurrence, such as a pointer going down or up. The axis values describe the position and movement properties.
7
Three basic MotionEvent action codes can be combined to create touch gestures: ACTION_DOWN, ACTION_MOVE, and ACTION_UP.
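A minimal sketch, assuming an Activity subclass (the class name and log tag are hypothetical), of how these three action codes could be distinguished in an overridden onTouchEvent():

```java
import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;

public class TouchDemoActivity extends Activity {

    private static final String TAG = "TouchDemo"; // hypothetical log tag

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // getActionMasked() strips the pointer-index bits, leaving the action code
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                Log.d(TAG, "Finger down at (" + event.getX() + ", " + event.getY() + ")");
                return true;
            case MotionEvent.ACTION_MOVE:
                Log.d(TAG, "Finger moving");
                return true;
            case MotionEvent.ACTION_UP:
                Log.d(TAG, "Finger lifted");
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
```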
8
7.4 Gesture Detector
The easiest approach to detecting specific touch gestures is to use the GestureDetector class. This employs an OnGestureListener to signal when a gesture occurs and then passes the triggering motion event to the GestureDetector’s onTouchEvent() method. The onTouchEvent() method determines exactly what action patterns are occurring on the screen. The scroll events produced by a fling gesture can provide information about the velocity of the moving finger and the distance it has traveled across the screen. A completed fling gesture is produced by the execution of the following callback methods: onDown(), onScroll(), and onFling().
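A possible setup (the Activity name is hypothetical), using the SimpleOnGestureListener convenience subclass rather than implementing OnGestureListener directly, and handing each triggered motion event to the detector’s onTouchEvent() method:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class GestureDemoActivity extends Activity {

    private GestureDetector gestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // SimpleOnGestureListener supplies empty defaults, so only the
        // callbacks of interest need to be overridden
        gestureDetector = new GestureDetector(this,
                new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onDown(MotionEvent e) {
                // Must return true, or the rest of the gesture is ignored
                return true;
            }

            @Override
            public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                    float distanceX, float distanceY) {
                return true;
            }

            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2,
                                   float velocityX, float velocityY) {
                return true;
            }
        });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Hand every raw touch event to the detector for interpretation
        return gestureDetector.onTouchEvent(event);
    }
}
```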
9
GestureDetector has limited use in Android applications because it cannot handle all types of gestures. The onDown() method is called automatically when a tap occurs, with the down MotionEvent that triggered it. The onLongPress() callback is called when a long press occurs, with the initial down MotionEvent that triggered it. The onSingleTapUp() callback occurs when a tap gesture completes, with the up motion that triggered it.
10
Similar to the onSingleTapUp() callback, onSingleTapConfirmed() will occur when a detected tap gesture is confirmed by the system as a single tap and not part of a double tap gesture
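A short sketch of a listener (hypothetical class name) that shows where each of these tap callbacks fires:

```java
import android.view.GestureDetector;
import android.view.MotionEvent;

// Hypothetical listener illustrating the difference between the tap callbacks
public class TapListener extends GestureDetector.SimpleOnGestureListener {

    @Override
    public boolean onSingleTapUp(MotionEvent e) {
        // Called on the up motion of any tap, including the first tap of a double tap
        return true;
    }

    @Override
    public boolean onSingleTapConfirmed(MotionEvent e) {
        // Called only after the system has ruled out a second tap
        return true;
    }

    @Override
    public boolean onDoubleTap(MotionEvent e) {
        // Called when the second tap of a double tap is detected
        return true;
    }
}
```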
12
7.5 The MotionEvent Class
The GestureDetector class allows basic detection of common gestures. This type of gesture detection is suitable for applications that require only simple gestures; the GestureDetector class is not designed for handling complicated gestures. A more sophisticated form of gesture detection is to register an OnTouchListener event handler with a specific View, such as a graphic object on the stage that can be dragged. To provide touch event notification, the onTouchEvent() method can be overridden for an Activity or a touchable View.
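A rough sketch, with a hypothetical helper class and method name, of registering an OnTouchListener on a View so that it follows the finger as it is dragged:

```java
import android.view.MotionEvent;
import android.view.View;

// Hypothetical helper: attach an OnTouchListener so the View tracks the finger
public class DragHelper {

    public static void makeDraggable(View draggableView) {
        draggableView.setOnTouchListener(new View.OnTouchListener() {
            private float offsetX, offsetY;

            @Override
            public boolean onTouch(View v, MotionEvent event) {
                switch (event.getActionMasked()) {
                    case MotionEvent.ACTION_DOWN:
                        // Remember where inside the View the finger landed
                        offsetX = event.getX();
                        offsetY = event.getY();
                        return true;
                    case MotionEvent.ACTION_MOVE:
                        // Reposition the View so it tracks the finger's screen position
                        v.setX(event.getRawX() - offsetX);
                        v.setY(event.getRawY() - offsetY);
                        return true;
                    default:
                        return false;
                }
            }
        });
    }
}
```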
13
The MotionEvent class provides a collection of methods to query the position and other properties of the fingers used in a gesture:
getX(): Returns the x-axis coordinate value at the finger’s location on the screen
getY(): Returns the y-axis coordinate value at the finger’s location on the screen
getDownTime(): Returns the time when the user initially pressed down to begin a series of events
getXPrecision(): Returns the precision of the X coordinate
getAction(): Returns the type of action being performed by the user, such as ACTION_DOWN
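An illustrative snippet (the class and log tag are hypothetical) that queries these properties inside any touch callback:

```java
import android.util.Log;
import android.view.MotionEvent;

// Hypothetical utility: log a MotionEvent's basic properties
public class MotionEventLogger {

    public static void log(MotionEvent event) {
        Log.d("MotionEventLogger", "action=" + event.getAction()
                + " x=" + event.getX()
                + " y=" + event.getY()
                + " precisionX=" + event.getXPrecision()
                + " downTime=" + event.getDownTime() + "ms");
    }
}
```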
14
7.6 The Drag and Drop Gesture
Drag-and-drop is the action of tapping on a virtual object and dragging it to a different location. Drag-and-drop is most often associated with methods of data transfer. This gesture assumes that a drag source and a drop target exist.
15
The Android framework for drag-and-drop includes a drag event class, drag listeners, and helper methods and classes. Source and target containers must be created to hold the View elements that will be dragged and eventually dropped. As Figure 7-13 shows, a basic drag-and-drop design relies on at least two Views.
17
All elements that are intended to be movable objects in a drag-and-drop process must be registered with an appropriate listener. A touch listener, attached with setOnTouchListener(), must be set on each draggable View. This registers a callback to be invoked when an explicit touch event is sent to the draggable View object. A View container that functions as a source or target must be registered with an explicit “on drag” listener.
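A sketch of wiring up both listeners; the class and parameter names are hypothetical, and startDrag() is used here as the framework call that begins the drag (newer API levels prefer startDragAndDrop()):

```java
import android.content.ClipData;
import android.view.DragEvent;
import android.view.MotionEvent;
import android.view.View;

// Hypothetical helper: register a touch listener on the draggable View
// and an "on drag" listener on the target container
public class DragDropSetup {

    public static void wireUp(View draggableView, View targetContainer) {
        // Touch listener on the draggable View starts the drag operation
        draggableView.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                    ClipData data = ClipData.newPlainText("label", "dragged item");
                    // startDrag() builds the drag shadow and begins the system drag
                    v.startDrag(data, new View.DragShadowBuilder(v), v, 0);
                    return true;
                }
                return false;
            }
        });

        // Drag listener on the target container reacts to drag events
        targetContainer.setOnDragListener(new View.OnDragListener() {
            @Override
            public boolean onDrag(View v, DragEvent event) {
                switch (event.getAction()) {
                    case DragEvent.ACTION_DRAG_STARTED:
                        return true; // accept further events for this drag
                    case DragEvent.ACTION_DROP:
                        // The dragged View was passed as the "local state" object
                        View dropped = (View) event.getLocalState();
                        return dropped != null;
                    default:
                        return true;
                }
            }
        });
    }
}
```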
18
During a drag-and-drop operation, the system provides a separate image that the user drags. For data movement, this image represents a copy of the object being dragged. This mechanism makes it clear to the user that an object is in the process of being dragged and has not yet been placed in its final target location. The dragged image is called a drag shadow because it is a shadow version of the original object.
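A sketch of a custom drag shadow (hypothetical class name) built on the framework’s View.DragShadowBuilder helper, which draws the dragged View and centers the shadow under the finger:

```java
import android.graphics.Canvas;
import android.graphics.Point;
import android.view.View;

// Hypothetical drag shadow that centers the shadow image under the finger
public class CenteredDragShadow extends View.DragShadowBuilder {

    public CenteredDragShadow(View view) {
        super(view); // the shadow is drawn from this View
    }

    @Override
    public void onProvideShadowMetrics(Point shadowSize, Point shadowTouchPoint) {
        View v = getView();
        // Shadow is the same size as the View being dragged
        shadowSize.set(v.getWidth(), v.getHeight());
        // Place the touch point at the center of the shadow
        shadowTouchPoint.set(v.getWidth() / 2, v.getHeight() / 2);
    }

    @Override
    public void onDrawShadow(Canvas canvas) {
        // Render the View itself as the shadow image
        getView().draw(canvas);
    }
}
```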
22
7.7 Fling Gesture
A fling is a core touchscreen gesture that is also known as a swipe: a quick swiping movement of a finger across a touchscreen. As with other gestures, MotionEvent objects are used to report a fling event. The motion events describe the fling movement in terms of an action code and a set of axis values describing movement properties.
23
For common gestures, such as a fling, Android provides the GestureDetector class, which can be used in tandem with the onTouchEvent() callback method. An application Activity can implement the GestureDetector.OnGestureListener interface to identify when a specific touch event has occurred. Once these events are received, they can be handed off to the overridden onTouchEvent() callback.
24
7.8 Fling Velocity
Tracking the movement in a fling gesture requires the start and end positions of the finger and the velocity of the movement across the touchscreen. The direction of a fling can be determined from the x and y coordinates captured by the ACTION_DOWN and ACTION_UP MotionEvents and from the resulting velocities.
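A sketch, with assumed threshold values and a hypothetical class name, of reading the down event (e1), the final event (e2), and the reported velocities inside onFling() to decide the fling’s direction:

```java
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;

// Hypothetical listener: determine fling direction from positions and velocity
public class FlingDirectionListener extends GestureDetector.SimpleOnGestureListener {

    private static final float MIN_DISTANCE = 100;  // assumed threshold, in pixels
    private static final float MIN_VELOCITY = 200;  // assumed threshold, pixels/second

    @Override
    public boolean onDown(MotionEvent e) {
        return true; // required so the detector keeps tracking the gesture
    }

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
                           float velocityX, float velocityY) {
        float deltaX = e2.getX() - e1.getX();  // horizontal distance traveled
        if (Math.abs(deltaX) > MIN_DISTANCE && Math.abs(velocityX) > MIN_VELOCITY) {
            Log.d("Fling", deltaX > 0 ? "Fling right" : "Fling left");
            return true;
        }
        return false;
    }
}
```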
25
VelocityTracker is an Android helper class for tracking the velocity of touch events, including those used to implement a fling. In applications such as games, a fling is a movement-based gesture that produces behavior based on the distance and direction a finger travels during the gesture. The VelocityTracker class is designed to simplify velocity calculations in movement-based gestures.
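A sketch (hypothetical class name) of using VelocityTracker inside an OnTouchListener to measure how fast a finger is moving during a gesture:

```java
import android.util.Log;
import android.view.MotionEvent;
import android.view.VelocityTracker;
import android.view.View;

// Hypothetical listener: track finger velocity for the duration of a gesture
public class VelocityTouchListener implements View.OnTouchListener {

    private VelocityTracker velocityTracker;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                velocityTracker = VelocityTracker.obtain();
                velocityTracker.addMovement(event);
                return true;
            case MotionEvent.ACTION_MOVE:
                if (velocityTracker != null) {
                    velocityTracker.addMovement(event);
                    // 1000 gives velocity in pixels per second
                    velocityTracker.computeCurrentVelocity(1000);
                    Log.d("Velocity", "x=" + velocityTracker.getXVelocity()
                            + " y=" + velocityTracker.getYVelocity());
                }
                return true;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                if (velocityTracker != null) {
                    velocityTracker.recycle(); // release the tracker when the gesture ends
                    velocityTracker = null;
                }
                return true;
            default:
                return false;
        }
    }
}
```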
26
7.9 Multi-Touch Gestures
A pinch gesture involves two fingers placed on the screen. The finger positions, and the distance between them, are recorded. When the fingers are lifted from the screen, the distance separating them is recorded again. If the second recorded distance is less than the first, the gesture is recognized as a pinch.
27
A spread gesture is similar to a pinch gesture in that it also records the start and end distances between the fingers on the touchscreen. If the second recorded distance is greater than the first, the gesture is recognized as a spread.
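A rough sketch, with a hypothetical class name, of comparing the two-finger distances recorded when the second finger goes down and when it comes back up:

```java
import android.util.Log;
import android.view.MotionEvent;

// Hypothetical detector: compare start and end finger distances
public class PinchSpreadDetector {

    private float startDistance = -1;

    public void onTouchEvent(MotionEvent event) {
        if (event.getPointerCount() < 2) {
            return; // a pinch or spread needs two fingers on the screen
        }
        // Distance between pointer 0 and pointer 1
        float dx = event.getX(1) - event.getX(0);
        float dy = event.getY(1) - event.getY(0);
        float distance = (float) Math.sqrt(dx * dx + dy * dy);

        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_POINTER_DOWN:
                startDistance = distance; // second finger placed: record start distance
                break;
            case MotionEvent.ACTION_POINTER_UP:
                if (startDistance > 0) {
                    Log.d("MultiTouch",
                            distance < startDistance ? "Pinch" : "Spread");
                }
                startDistance = -1;
                break;
        }
    }
}
```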