TUTORIAL ON MULTITOUCH AND SWIPE GESTURES


TUTORIAL ON MULTITOUCH AND SWIPE GESTURES
University of Texas at El Paso
Presented by: Kehinde Akinola

Objectives
Know some common gestures in Android
Detect common gestures (using Android as the example platform)
Create multitouch gestures
Know what a swipe gesture is
Handle multi-touch gestures in Android / track multiple pointers
Get a MotionEvent's action

Introduction
What are gestures? Gesture recognition is the mathematical interpretation of a human motion by a computing device. Computing devices: smartphones, tablets, 3-D virtual-reality systems, etc. Example gestures: pinch, double tap, scroll, and long press on a touch-screen device. This tutorial covers multitouch gestures on the touchscreens of Android devices. Android provides the GestureDetector class for recognizing common gestures.

Basics of Working with Gestures (in Android)
1. Capture touch events.
2. Process the touch events.
At the heart of all gestures is the OnTouchListener interface and its onTouch() method, which has access to MotionEvent data. Every View can have an OnTouchListener. Important classes to import:
import android.view.MotionEvent;
import android.view.GestureDetector;

Tracking Multiple Pointers (in Android)
Many mobile devices have touch screens, and with a touch screen you can have gestures.
Definition: a multitouch gesture is when multiple pointers (fingers) touch the screen at the same time. Examples: use two fingers to rotate an object, or use two fingers to magnify an image.
Touch events generated for multiple pointers in Android:
ACTION_DOWN: for the first pointer that touches the screen. This starts the gesture. The pointer data for this pointer is always at index 0 in the MotionEvent.
ACTION_POINTER_DOWN: for extra pointers that enter the screen beyond the first. The pointer data for this pointer is at the index returned by getActionIndex().
ACTION_MOVE: a change has happened during a press gesture.
ACTION_POINTER_UP: sent when a non-primary pointer goes up.
ACTION_UP: sent when the last pointer leaves the screen.
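The pointer-index bookkeeping behind getActionIndex() can be seen without a device. The sketch below is a standalone JVM example (no Android SDK): it reimplements the index/action unpacking that MotionEvent performs internally. The constant values are taken from Android's documented MotionEvent constants; the class name ActionDecode is a hypothetical helper, not an Android class.

```java
/*
 * Standalone sketch of how a MotionEvent packs a pointer index and an action
 * code into a single int, and how getActionMasked()/getActionIndex() unpack it.
 * Constant values match the documented android.view.MotionEvent constants.
 */
public class ActionDecode {
    static final int ACTION_MASK = 0x00ff;                // MotionEvent.ACTION_MASK
    static final int ACTION_POINTER_INDEX_MASK = 0xff00;  // MotionEvent.ACTION_POINTER_INDEX_MASK
    static final int ACTION_POINTER_INDEX_SHIFT = 8;      // MotionEvent.ACTION_POINTER_INDEX_SHIFT
    static final int ACTION_DOWN = 0;                     // MotionEvent.ACTION_DOWN
    static final int ACTION_POINTER_DOWN = 5;             // MotionEvent.ACTION_POINTER_DOWN

    // Equivalent of MotionEvent.getActionMasked(): strip the pointer index.
    static int actionMasked(int action) {
        return action & ACTION_MASK;
    }

    // Equivalent of MotionEvent.getActionIndex(): extract the pointer index.
    static int actionIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }

    public static void main(String[] args) {
        // A second finger (pointer index 1) going down is encoded as:
        int action = (1 << ACTION_POINTER_INDEX_SHIFT) | ACTION_POINTER_DOWN;
        System.out.println("masked action = " + actionMasked(action)); // 5
        System.out.println("pointer index = " + actionIndex(action));  // 1
    }
}
```

This is why multitouch code should switch on getActionMasked() rather than getAction(): the raw action of a secondary finger going down is not equal to ACTION_POINTER_DOWN until the index bits are masked off.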

To handle these events, override the onTouchEvent() method and check the events manually. Its syntax is given below:

public boolean onTouchEvent(MotionEvent ev) {
    final int variableAction = ev.getAction();
    switch (variableAction) {
        case MotionEvent.ACTION_DOWN: {
            // perform the activities you want on press
            break;
        }
        case MotionEvent.ACTION_MOVE: {
            // perform the activities you want on move
            break;
        }
    }
    return true;
}

Need to implement:
GestureDetector.OnGestureListener
GestureDetector.OnDoubleTapListener, including its onDoubleTap() method (assuming you want something done when the screen is double-tapped).
Example:

public class MainActivity extends Activity implements
        GestureDetector.OnGestureListener, GestureDetector.OnDoubleTapListener {
    ...
}

@Override
public boolean onDoubleTap(MotionEvent event) {
    Log.d(DEBUG_TAG, "onDoubleTap: " + event.toString());
    return true;
}

Creating an instance of the Android GestureDetectorCompat class. Rule: this is done when the activity is first created.

// called when the activity is first created
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    // instantiate with an application context and an implementation of
    // GestureDetector.OnGestureListener and GestureDetector.OnDoubleTapListener
    kennyDetector = new GestureDetectorCompat(this, this);
    kennyDetector.setOnDoubleTapListener(this);
}

Detecting Gestures
Android provides the GestureDetector class for detecting common gestures: onDown(), onLongPress(), onFling(), etc. Combine the GestureDetector class with the onTouchEvent() method described above. (Source: Wikipedia)

Implementation of the onTouchEvent() Callback Method
When the user places a finger on the screen, the onTouchEvent() callback is triggered. Once a touch is identified as a gesture, the corresponding listener method fires. The Android GestureDetector class is used for detecting common gestures; examples of those supported: onDown(), onLongPress(), onFling(). GestureDetector can be used together with onTouchEvent().

To detect any of the events mentioned above, override the onTouchEvent() method. To get the X and Y coordinates of a touch, call the getX() and getY() methods. Their syntax is given below:

// to get the x or y values
final float x = ev.getX();
final float y = ev.getY();
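Successive getX()/getY() samples are usually turned into per-move deltas that accumulate into a pan offset. The arithmetic can be shown without the Android SDK; the sketch below is a standalone JVM example, and PanTracker is a hypothetical helper class, not an Android API.

```java
/*
 * Standalone sketch of the delta arithmetic behind a pan gesture: each move
 * event contributes (x - lastX, y - lastY) to an accumulated offset.
 */
public class PanTracker {
    private float lastX, lastY;   // position at the previous touch event
    private float panX, panY;     // accumulated pan offset

    // Corresponds to ACTION_DOWN: remember where the finger landed.
    void down(float x, float y) {
        lastX = x;
        lastY = y;
    }

    // Corresponds to ACTION_MOVE: add the movement since the previous event,
    // then update the remembered position so the next delta is incremental.
    void move(float x, float y) {
        panX += x - lastX;
        panY += y - lastY;
        lastX = x;
        lastY = y;
    }

    float panX() { return panX; }
    float panY() { return panY; }

    public static void main(String[] args) {
        PanTracker t = new PanTracker();
        t.down(100f, 200f);
        t.move(110f, 195f);   // finger moved right 10, up 5
        t.move(130f, 190f);   // right 20 more, up 5 more
        System.out.println(t.panX() + ", " + t.panY()); // 30.0, -10.0
    }
}
```

Forgetting to update lastX/lastY inside move() is a common bug: deltas are then measured from the original touch-down point, so the pan offset grows much faster than the finger moves.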

Some other methods provided by the MotionEvent class, with descriptions:
getAction(): returns the kind of action being performed
getPressure(): returns the current pressure of this event for the first pointer index
getRawX(): returns the original raw X coordinate of this event
getRawY(): returns the original raw Y coordinate of this event
getSize(): returns the size for the first pointer index
getSource(): gets the source of the event
getXPrecision(): returns the precision of the X coordinates being reported
getYPrecision(): returns the precision of the Y coordinates being reported

private int mActivePointerId; public boolean onTouchEvent(MotionEvent event) { .... // Get the pointer ID mActivePointerId = event.getPointerId(0); // ... Many touch events later... // Use the pointer ID to find the index of the active pointer // and fetch its position int pointerIndex = event.findPointerIndex(mActivePointerId); // Get the pointer's current position float x = event.getX(pointerIndex); float y = event.getY(pointerIndex); }
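The reason the snippet above stores a pointer ID and converts it back to an index on each event is that indices shift when another finger lifts, while IDs stay stable. That bookkeeping can be simulated without a device; the sketch below is a standalone JVM example, and PointerTracker is an illustrative stand-in, not an Android class.

```java
import java.util.ArrayList;
import java.util.List;

/*
 * Standalone sketch of pointer-ID bookkeeping: pointer indices shift when a
 * finger lifts, so gestures track the stable pointer ID and look its index
 * up per event (the role of MotionEvent.findPointerIndex()).
 */
public class PointerTracker {
    // IDs of the pointers currently on the "screen", ordered by index.
    private final List<Integer> pointers = new ArrayList<>();
    private int activePointerId = -1;

    // Corresponds to ACTION_DOWN / ACTION_POINTER_DOWN.
    void pointerDown(int id) {
        pointers.add(id);
        if (pointers.size() == 1) activePointerId = id; // first finger is active
    }

    // Corresponds to ACTION_POINTER_UP / ACTION_UP. If the active finger
    // lifted, fall back to the first remaining pointer.
    void pointerUp(int id) {
        pointers.remove(Integer.valueOf(id));
        if (id == activePointerId) {
            activePointerId = pointers.isEmpty() ? -1 : pointers.get(0);
        }
    }

    // Like findPointerIndex(): an ID's index changes as other fingers lift.
    int indexOf(int id) { return pointers.indexOf(id); }

    int activePointerId() { return activePointerId; }

    public static void main(String[] args) {
        PointerTracker t = new PointerTracker();
        t.pointerDown(0);                        // first finger: becomes active
        t.pointerDown(1);                        // second finger at index 1
        t.pointerUp(0);                          // active finger lifts
        System.out.println(t.activePointerId()); // 1
        System.out.println(t.indexOf(1));        // 0 (its index shifted down)
    }
}
```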

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    edit1 = (EditText) findViewById(R.id.editText);
    edit2 = (EditText) findViewById(R.id.editText2);
    edit3 = (EditText) findViewById(R.id.editText3);
    edit4 = (EditText) findViewById(R.id.editText4);
    tv1 = (TextView) findViewById(R.id.textView2);
    tv1.setOnTouchListener(new View.OnTouchListener() {
        public boolean onTouch(View v, MotionEvent event) {
            final int variableAction = event.getAction();
            switch (variableAction) {
                case MotionEvent.ACTION_DOWN: {
                    final float x = event.getX();
                    final float y = event.getY();
                    lastXAxis = x;
                    lastYAxis = y;
                    edit1.setText(Float.toString(lastXAxis));
                    edit2.setText(Float.toString(lastYAxis));
                    break;
                }
                case MotionEvent.ACTION_MOVE: {
                    final float x = event.getX();
                    final float y = event.getY();
                    final float dx = x - lastXAxis;
                    final float dy = y - lastYAxis;
                    xAxis += dx;
                    yAxis += dy;
                    // update the last position so the next delta is
                    // relative to this event, not the original press
                    lastXAxis = x;
                    lastYAxis = y;
                    edit3.setText(Float.toString(xAxis));
                    edit4.setText(Float.toString(yAxis));
                    break;
                }
            }
            return true;
        }
    });
}

Swipe Features
Swipe-gesture behavior varies based on context. The speed at which a gesture is performed is the primary distinction between Drag, Swipe, and Fling.
Drag: fine gesture; slower, more controlled, typically has an on-screen target.
Swipe: gross gesture; faster, typically has no on-screen target.
Fling: gross gesture, with no on-screen target.
Gesture velocity impacts whether the action is immediately reversible. A swipe becomes a fling based on ending velocity and whether the affected element has crossed a threshold (a point past which an action can no longer be undone).
Source: Material Design, https://material.io/guidelines/patterns/gestures.html#gestures-touch-activities
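The drag/swipe/fling distinction above can be sketched as a small velocity classifier. The thresholds below are illustrative assumptions, not values from Material Design or the Android SDK; a real Android app would read the platform values, e.g. via ViewConfiguration.getScaledMinimumFlingVelocity().

```java
/*
 * Standalone sketch of classifying a completed gesture by its ending
 * velocity. SLOW_MAX and FLING_MIN are assumed, illustrative thresholds.
 */
public class GestureClassifier {
    // Assumed thresholds, in pixels per second (illustrative only).
    static final float SLOW_MAX = 300f;   // at or below this: a controlled drag
    static final float FLING_MIN = 2000f; // above this: a fling

    static String classify(float endVelocityPxPerSec) {
        if (endVelocityPxPerSec <= SLOW_MAX) return "drag";
        if (endVelocityPxPerSec > FLING_MIN) return "fling";
        return "swipe";
    }

    public static void main(String[] args) {
        System.out.println(classify(100f));   // drag
        System.out.println(classify(1000f));  // swipe
        System.out.println(classify(3000f));  // fling
    }
}
```

In practice the ending velocity would come from a VelocityTracker or from the velocityX/velocityY parameters of GestureDetector's onFling() callback.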

A drag maintains contact with an element, so reversing the direction of the gesture will drag the element back across the threshold. A fling moves at a faster speed and breaks contact with the element while it crosses the threshold, preventing the action from being undone.

CONCLUSION
Many more resources, including pattern designs for mobile devices, can be found at the URL below:
https://developer.android.com/training/gestures/multi.html

References https://developer.android.com/training/gestures/multi.html https://developer.android.com/training/gestures/multi.html#action https://material.io/guidelines/patterns/gestures.html#gestures-touch-activities