
1 Other Input Methods Pre-Lab Lecture 4

2 Revisit
 Pre-Lab 3 – Animation
   Boundary Information
   Layer Concept
   Animation algorithm
   Next Position Calculation
   Data and Image object
   Moving Out of Boundary
 Lab 3
   Shooting out a list of Bullets
   Shrinking the Time Left Bar
   Generating the target balloon and moving down

3 Overview of Lab 4
(Diagram: an INPUT → PROCESS → OUTPUT flow relating Touch on Screen and Tilt Device to Tutorial 4 Parts 1, 2, and 3.)

4 Input Method
 Now, we can control
   The movement of the user image through the Left or Right Button
   The bullet shooting through the Shoot Button
 In fact, the iPhone is well known for the following two kinds of input method, which will be discussed in this Pre-Lab Lecture
   Touches and Gestures Detection
   Device Tilting

5 Touch Event vs Action on UI Component
 Similarity
   Both involve a finger touching the screen
 Differences
   If the finger touches a part of the screen that has a UI component, and this UI component can respond to actions, then the touch becomes an action on the UI component
    e.g., pressing a button activates the “Touch Up Inside” action on the left button in Tutorial 1 Part 3.
   Otherwise, the touch becomes a touch event on the screen view
    i.e., we only know some basic information, such as
     The points touched by the finger on the screen view
     How many points are currently touched on the screen view
 By interpreting this basic information, some sophisticated touch actions and gestures can be detected.

6 Touch Actions and Gestures
 Tapping
   Single Tap – touch a single point of the screen once
   Double Tap – touch a single point of the screen twice
 Multi-touch
   Touch several points on the screen simultaneously
 Dragging
   Touch a certain UI component and move the center of the UI component
 Swiping
   Move on the screen to the right or left to represent the next/previous page
 Zooming In or Out
   Move two fingers towards or away from each other to represent a zoom-in or zoom-out gesture

7 Tapping – Single Tap
(Diagram: a single touch point on the screen view.)
Touch a single point on the screen.

8 Tapping – Double Taps and Multi Taps
(Diagram: first and second touch points at the same position on the screen view.)
Double taps: touch a single point on the screen twice within a short period of time.
Multiple taps: taps more than twice can also be detected.

9 Multi-Touches
(Diagram: touch points 1 and 2 on the screen view.)
Multi-touches: it is possible to detect more than one touch on the screen simultaneously.

10 Dragging
(Diagram: screen view.)
Dragging: touch a certain UI component (one that cannot respond to actions) and move it, together with your finger, to another place on the screen.

11 Swiping
(Diagram: screen view.)
Swiping: touch a certain point on the screen and move to another place.
Usually, moving right is used to represent the gesture of next page, and moving left is used to represent the gesture of previous page.

12 Zooming In/Out
(Diagram: screen view.)
Zooming in: two fingers touch the screen simultaneously and move towards each other. This represents the gesture of zooming in to see more detail in the image.
Zooming out: two fingers touch the screen simultaneously and move away from each other.

13 Basic Touch Event Handlers
 Traditionally, the iPhone SDK does not provide any method for interpreting touch actions and gestures directly
 It only provides three basic methods for handling three different stages of a touch event
   Touches Began
    This method is invoked ONCE each time a finger first touches the screen
   Touches Moved
    This method is invoked CONTINUOUSLY while the finger moves on the screen
   Touches Ended
    This method is invoked ONCE when the finger leaves the screen

14 Part 1A – Touch Events Test
 Basically, an iPhone app can handle the three basic touch events when you implement the following methods inside the view controller
   touchesBegan method
   touchesMoved method
   touchesEnded method
 We will then discuss how to interpret several common touch actions from these touch events
   Single tap on the screen
   Moving on the screen
   Double taps on the screen
   Two touches on the screen

15 Basic Touch Methods

// Touches Began Method
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    NSLog(@"Touches Began");
}

// Touches Moved Method
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    NSLog(@"Touches Moved");
}

// Touches Ended Method
-(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
    NSLog(@"Touches Ended");
}

16 Example Situation 1 – Single Tap on Screen
(Timeline: finger touches the screen → touchesBegan; finger leaves the screen → touchesEnded.)

17 Example Situation 2 – Moving on the Screen
(Timeline: finger touches the screen → touchesBegan; finger moves on the screen → touchesMoved; finger leaves the screen → touchesEnded.)

18 Information Provided by a Touch Event
 Within each touch event method, you can request some more detailed information
   How many touches are currently identified on the screen
   The touch point of each touch on the screen
   How many times a certain point has been touched consecutively

19 Example: Touches Began Event

-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    // By invoking the allTouches method of event (the UIEvent passed to the
    // touch method, i.e., the touches began method in this case), a set of
    // touches is returned.
    // Note that the return type is NSSet, and using allTouches as the name of
    // the variable holding the set is just a convention. You can definitely
    // use your own variable name.
    NSSet * allTouches = [event allTouches];

    // By invoking the allObjects method of allTouches, we then get an NSArray
    // holding the UITouch objects.
    // Note that UITouch can be regarded as the object which actually holds the
    // information of a specific touch on the screen, e.g., the point at which
    // the touch occurs. In the following code, we are interested in the first
    // object of the array.
    UITouch * touch = [[allTouches allObjects] objectAtIndex:0];

20 Information That We May Be Interested In

    // Touch point location information
    // By invoking the locationInView method of the UITouch, we can get the
    // actual touch point location of this touch in our current screen view,
    // using the current screen view as the input parameter, i.e., [self view]
    CGPoint touchPoint = [touch locationInView:[self view]];

    // Number of touches simultaneously on screen
    // Recall that allTouches is the variable holding the set of touches on the
    // screen. We can ask how many touches occur on the screen by asking for
    // the set's size directly
    [allTouches count];

    // Number of consecutive touches on a single point
    // Recall that touch is a variable referring to the information of a
    // specific touch on the screen. We can invoke its tapCount method to get
    // this information
    [touch tapCount];

21 Example Situation 3 – Double Taps or Multi Taps

NSSet * allTouches = [event allTouches];
UITouch * touch = [[allTouches allObjects] objectAtIndex:0];
switch ([touch tapCount]){
    case 2:{
        NSLog(@"Touched a point 2 times on screen");
        CGPoint touchPoint = [touch locationInView:[self view]];
        NSLog(@"x: %f, y: %f", touchPoint.x, touchPoint.y);
    }
}

22 Example Situation 4 – Two or More Touches on Screen

NSSet * allTouches = [event allTouches];
switch([allTouches count]){
    case 2:{
        NSLog(@"touch 2 points on screen");
        UITouch * touch1 = [[allTouches allObjects] objectAtIndex:0];
        CGPoint touchPoint1 = [touch1 locationInView:[self view]];
        NSLog(@"x: %f, y: %f", touchPoint1.x, touchPoint1.y);
        UITouch * touch2 = [[allTouches allObjects] objectAtIndex:1];
        CGPoint touchPoint2 = [touch2 locationInView:[self view]];
        NSLog(@"x: %f, y: %f", touchPoint2.x, touchPoint2.y);
    }
}

23 Discussion on the Multi-Touch Case
 Please note that the order of the touch points is not fixed
 Referring to the previous example,
   (a) The first touch point is “x:204, y:109”
   (b) The second touch point is “x:276, y:211”
 However, even if you touch the same two points again, it is possible for (a) to become the second touch point and (b) the first touch point
   i.e., the order is reversed

24 Simulating Two Touch Points in the iPhone Simulator
 In the iPhone Simulator, a single touch point is used by default.
 To simulate two points, hold down OPTION while you move the mouse pointer over the screen view.

25 Part 1B – Double Taps to Shoot
 Hints:
   Currently, when you press the Shoot Button, a bullet is shot
   To support the new function, add a method “fireBullet” to handle all the bullet-shooting situations. Invoke the method when the user
    Presses the Shoot Button
    Double-taps on the screen

26 Part 2 – Touch Moved and Dragging Practice
 Objective:
   Allow the shooter to be dragged horizontally when the user touches the image.
   The orientation of the image can be changed by swiping outside the image.
 Algorithm:
   Detect the touch-move position on the screen
   Check whether the touch falls on the image or not
   If true, handle the userImage move situation
   If false, handle the angle change situation
 Problems:
   Why implement the function in touchesMoved?
   How do we distinguish whether the finger falls on the image or not?
   How do we calculate the angle moved?

27 Image Touch Detection Technique
 Recall that to detect the touch location on the screen view:
   UITouch * touch = [[allTouches allObjects] objectAtIndex:0];
   CGPoint touchPT = [touch locationInView:[self view]];
 To check whether your finger touch falls on the image:
   touchPT.x > x1 and touchPT.x < x2
   touchPT.y > y1 and touchPT.y < y2
 Note that x2 = x1 + width and y2 = y1 + height
(Diagram: image corners at (x1, y1), (x2, y1), (x1, y2), (x2, y2).)

28 Angle Calculation Technique
(Diagram: a triangle formed by the finger touch point P1 (x’, y’), the user image center point P2 (x, y), and an artificial point P3 (x, y − 10); the angle at P2 can be calculated.)
For your convenience, we implemented AngleCalculator.h and AngleCalculator.m.
The method find_angle takes CGPoints p1, p2, p3 as input parameters and returns the angle at p2 of the triangle formed by the three points.
To make a CGPoint, you can use CGPointMake(float x, float y).

29 Image Angle Rotation
 To change the angle of the userImage, we can rotate the userImage in the appropriate direction

    // Rotate the view
    CGAffineTransform transform = CGAffineTransformMakeRotation( );
    .transform = transform;

 Note that the angle of rotation is always measured from 0, no matter what the current angle of rotation is
(Diagram: the angle of rotation, measured from the 0 position.)

30 Input Method – Device Tilting
 The iPhone allows an application to detect device tilting by taking accelerometer readings periodically
 It provides accelerometer readings in three directions
   X, Y, and Z
 To simplify the case, we will only explain X and Y in this Pre-Lab
 An accelerometer reading is bounded within [-1, 1], depending on the degree to which you tilt the device
   i.e., the larger the degree of tilt, the larger its absolute value; the maximum absolute value is 1
 We can use this as another kind of input method for our application

31 Horizontal Position
 This is the position in which the device is placed horizontally on the desk
 In this case, X = 0, Y = 0
 Suppose we rotate the device to the right

32 Right Rotation Position
 This is the position in which the device is rotated to the right on the desk
 In this case, X = 1, Y = 0
 Suppose we rotate the device to the left this time

33 Left Rotation Position
 This is the position in which the device is rotated to the left on the desk
 In this case, X = -1, Y = 0
 Suppose we rotate the device to the vertical up position

34 Vertical Up Position
 This is the position in which the device is rotated to the vertical up position
 In this case, X = 0, Y = -1
 Suppose we rotate the device upside down this time

35 Vertical Upside Down Position
 This is the position in which the device is rotated to the vertical upside down position
 In this case, X = 0, Y = 1

36 To Conclude
(Diagram: summary of the accelerometer axes, showing the +x, −x, +y, and −y tilt directions.)

37 Recall: Screen View Orientation and Coordinate Representation
(Diagram: in landscape mode the screen corners are (0, 0), (480, 0), (0, 320), and (480, 320); in portrait mode they are (0, 0), (320, 0), (0, 480), and (320, 480).)
The coordinate system changes when the screen changes from portrait mode to landscape mode.

38 Accelerometer Reading
 However, the accelerometer's axis representation does not change even when the orientation of the screen changes
(Diagram: the +x, −x, +y, and −y accelerometer axes drawn on the device in both portrait mode and landscape mode, relative to the Home Button.)

39 Part 3A – Detecting Device Tilting Events (UIAccelerometer Events)
 Implement the UIAccelerometerDelegate protocol by changing the header of BallShootingViewController to

    // This is to show that this view controller has implemented a method to
    // handle the accelerometer reading
    @interface BallShootingViewController : UIViewController <UIAccelerometerDelegate>

 Initialize the accelerometer reading with the update interval by writing the method initializationAccelerometer. Make sure you call this method in viewDidLoad.

    // Reset the Accelerometer
    [[UIAccelerometer sharedAccelerometer] setDelegate:nil];
    // Set the Accelerometer to take a reading every DEFAULT_TIMER_RATE seconds
    [[UIAccelerometer sharedAccelerometer] setUpdateInterval:DEFAULT_TIMER_RATE];
    // This is to tell the Accelerometer that the current view controller has
    // implemented a method for handling the reading when it is ready
    [[UIAccelerometer sharedAccelerometer] setDelegate:self];

40 Part 3A – Detecting Device Tilting Events (UIAccelerometer Events) II
 Implement the didAccelerate method in the view controller, which will be invoked when an accelerometer reading is ready

    - (void) accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration{
        // You can get the corresponding X, Y, and Z readings by looking at
        // acceleration.x, acceleration.y, and acceleration.z in the
        // didAccelerate method
        NSLog(@"Accel x: %f, Accel y: %f, Accel z: %f", acceleration.x, acceleration.y, acceleration.z);
    }

 Note that you must load your application onto a real device to test the results.

41 Part 3B – Tilting Device to Move User Image
 In Part 3B, we would like the student to implement the following functions
   Tilt the device to the right-hand side in landscape mode to move the userImage to the right
   Tilt the device to the left-hand side in landscape mode to move the userImage to the left
 Hints:
   Apply an appropriate offset (obtained from the accelerometer reading) to the center position of the userImage
   Carefully handle the accelerometer reading when tilting the device

