Touch and Go: Leading Touch UI with Open Source


1 Touch and Go: Leading Touch UI with Open Source
Presentation by
Find us in #ubuntu-touch on Freenode

2 Agenda
A brief history of multitouch platforms
Extrapolations from single application interfaces to windowed desktop environments
Gestures and an overview of the uTouch gesture stack
The future of uTouch and gestures

3 A brief history of multitouch platforms
First multitouch system in 1982, by the University of Toronto Input Research Group
Pierre Wellner's "Digital Desk" in 1991 explored revolutionary multitouch use cases
Fingerworks developed multitouch technologies from 1999 until being acquired by a fruity company
And then there was one more thing...

4 A brief history of multitouch platforms
The iPhone was the first mainstream multitouch platform
Introduced consumers to intuitive two-touch gestures:
  Pinch to zoom
  Drag to pan content
First gaming platform with multitouch

5 A brief history of multitouch platforms
What about computers?

6 A brief history of multitouch platforms
Trackpads have long supported "multifinger" functionality
  Definition: a single point plus the number of touches
  Used for two-finger scrolling and two- and three-finger tap
Apple introduced full multitouch trackpads in the late-2008 MacBook Pro models
Most laptops now have some form of multifinger, semi-multitouch, or full multitouch trackpads
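The "multifinger" model above can be made concrete with a small sketch. This is an illustrative model, not any real driver interface: the hardware reports only one point plus a finger count, and the software layer reinterprets motion as scrolling when two fingers are down.

```c
#include <assert.h>

/* Hypothetical "multifinger" trackpad report: a single (x, y) point
 * plus the number of fingers in contact -- no per-finger positions. */
struct multifinger_report {
    double x, y;      /* the single aggregate point */
    int num_fingers;  /* how many fingers are touching */
};

/* Two-finger scroll: while two fingers are down, vertical motion of the
 * single point is treated as a scroll delta rather than pointer motion. */
double scroll_delta(const struct multifinger_report *prev,
                    const struct multifinger_report *cur)
{
    if (prev->num_fingers == 2 && cur->num_fingers == 2)
        return cur->y - prev->y;  /* interpreted as scroll */
    return 0.0;                   /* otherwise, no scroll */
}
```

Note that nothing here identifies which finger moved; that limitation is exactly what separates multifinger trackpads from full multitouch ones.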

7 A brief history of multitouch platforms
Touch events propagate to a window on screen and are then "grabbed" by that window until they end
  Works great for touchscreens
Issues with touchpads:
  Use case: two-finger scroll, lift one finger, move the pointer to another window, touch a second finger to scroll again
  The finger that remained in contact has to change propagation from one window to another
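The grab semantics described above can be sketched as a tiny state machine. The names are illustrative, not the real X11 API: the point is only that a touch is bound to the window it began over, which is precisely what breaks in the touchpad use case.

```c
#include <assert.h>

/* Illustrative per-touch grab: a touch is bound to whichever window it
 * began over and keeps delivering events there until it ends. */
typedef int window_id;

struct touch {
    int active;       /* 1 while the finger is in contact */
    window_id owner;  /* window grabbed at touch begin */
};

/* Returns the window that receives this touch's events. */
window_id deliver_to(struct touch *t, window_id window_under_finger)
{
    if (!t->active) {                 /* touch begin: grab this window */
        t->active = 1;
        t->owner = window_under_finger;
    }
    return t->owner;                  /* all later events stay with owner */
}
```

In the two-finger-scroll use case, the still-touching finger's events keep going to the old window even after the pointer has moved elsewhere, which is the propagation problem the slide describes.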

8 A brief history of multitouch platforms
Mobile devices (so far):
  Touchscreen support only
  Only one full-screen application at a time
  No system-level gestures
OS X:
  Trackpad support only
  Mixture of system- and application-level gestures
Windows 7:

9 A brief history of multitouch platforms
What can we learn from these implementations?

10 A brief history of multitouch platforms
Issue: system- vs application-level gestures
  Apple has tried to introduce system-level gestures after release
  Backed down when applications broke
Conclusions:
  Start with a mix of system and application gestures
  Possibly make it configurable by the application

11 A brief history of multitouch platforms
Issue: touch vs gesture event propagation and delivery
  Windows only allows a window to exclusively receive touch or gesture events
    As a result, applications tend to do their own gestures
  OS X mixes touch and gesture events in one stream
    An application can handle gestures and follow touches simultaneously
Conclusions:
  Make touch and gesture events usable together

12 A brief history of multitouch platforms
Issue: trackpad one- and multi-touch interactions
  Should a touch be canceled if the focused window changes?
  When, if ever, can a touch change the window it propagates events to?
Conclusions: TBD!
(Note that in OS X, two-finger scroll is handled on the window-server side before touch and gesture events are generated)

13 Gestures and the X11 uTouch stack
Multitouch gestures are a means of simplifying touch event handling
  The corollary is that gesture event handling is complex!
Some issues:
  Building on X11
  Resource and power constraints
  Consistent "feel"
  Low latency of touch events
  Support for multiple distinct simultaneous gestures
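To make the "gestures simplify touch handling" point concrete, here is a minimal sketch of what a recognizer distills out of raw touches: a two-touch pinch reduced to a single scale value. The helper names are hypothetical, not part of uTouch, and squared distances are compared so no square root is needed.

```c
#include <assert.h>

/* Illustrative pinch recognition: the gesture "scale" is the ratio of
 * the current finger separation to the separation at gesture start. */
struct point { double x, y; };

static double separation_sq(struct point a, struct point b)
{
    double dx = b.x - a.x, dy = b.y - a.y;
    return dx * dx + dy * dy;  /* squared distance between two touches */
}

/* Returns scale^2: > 1 means the fingers spread apart (zoom in),
 * < 1 means they pinched together (zoom out). */
double pinch_scale_sq(const struct point start[2], const struct point now[2])
{
    return separation_sq(now[0], now[1]) / separation_sq(start[0], start[1]);
}
```

An application consuming a gesture event only sees the scale; the recognizer absorbs all the per-touch bookkeeping, which is exactly where the complexity moves.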

14 Gestures and the X11 uTouch stack
uTouch is currently based on the X.org X11 window system
X11:
  is almost 24 years old!
  was not designed for input other than one keyboard and mouse
  is tied to the original protocols
  requires thorough protocol and development review to be extended
These are constraints on any X11-based multitouch gesture system

15 Gestures and the X11 uTouch stack
Further constraints we chose for uTouch:
  Only one gesture recognizer instance
    Uniform "feel" across the desktop
    Recognition occurs only once, no matter which client is listening
  Potential for multiple distinct simultaneous gestures
  Support both multitouch and gestures, together and separately
  Low latency of events

16 Gestures and the X11 uTouch stack
First half of the solution: X.org XInput 2.2 multitouch
  Supports touchpads and touchscreens
  Clients control delivery of events from given touches
    Must accept or reject touches they are in control of
    May receive events for touches they do not control yet
  The first touchscreen touch may be transformed into pointer motion
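The accept/reject ownership hand-off can be modeled in a few lines. This is a simplified sketch of the idea, not the real XInput 2.2 protocol or API: clients are consulted in grab order, a rejection replays the touch to the next client, and acceptance (or indecision) pins the touch to the current one.

```c
#include <assert.h>

/* Simplified model of touch ownership hand-off: each client in the
 * delivery chain has either accepted, rejected, or not yet decided. */
enum verdict { UNDECIDED, ACCEPT, REJECT };

/* Returns the index of the client that currently owns the touch,
 * or -1 if every client in the chain rejected it. */
int resolve_owner(const enum verdict verdicts[], int num_clients)
{
    for (int i = 0; i < num_clients; i++) {
        if (verdicts[i] == ACCEPT)
            return i;  /* this client keeps the touch for good */
        if (verdicts[i] == UNDECIDED)
            return i;  /* still deciding; events are delivered to it */
        /* REJECT: the touch is replayed to the next client in line */
    }
    return -1;
}
```

The "may receive events for touches they do not control yet" bullet corresponds to clients further down the chain, which only gain control if everyone before them rejects.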

17 Gestures and the X11 uTouch stack
Second half of the solution: the X11 client-side uTouch stack
  Grabs touches on the root window (i.e. the entire desktop)
  Detects all possible combinations of gestures
    Limited by:
      what is subscribed
      inferences from system- vs application-level gestures
  If no gestures match, rejects the touches so other clients may handle them
  Begins sending gestures to uTouch clients
    uTouch clients accept or reject gestures
    Acceptance propagates to the XI 2.2 touch accept/reject
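The routing decision at the end of that pipeline can be reduced to one function. This is an illustrative condensation of the flow described above, not uTouch code: a gesture event only goes out if the recognizer matched and a uTouch client claimed it; in every other case the touches are rejected so ordinary clients see plain touch events.

```c
#include <assert.h>

/* Sketch of the uTouch routing decision: either a gesture is delivered
 * (and the underlying XI 2.2 touches are accepted on the client's
 * behalf) or the touches are replayed to ordinary clients. */
enum outcome { DELIVER_GESTURE, REPLAY_TOUCHES };

enum outcome route_touches(int gesture_recognized, int client_accepted)
{
    if (gesture_recognized && client_accepted)
        return DELIVER_GESTURE;  /* touches accepted, gesture stream wins */
    return REPLAY_TOUCHES;       /* rejected: others handle raw touches */
}
```

Note the two-stage veto: both the recognizer and the subscribing client must opt in before raw touches are withheld from other applications.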

18 Gestures and the Future
What about Wayland?

19 Gestures and the Future
Wayland has two key advantages over X11:
  It is new and has the benefit of hindsight
  The protocol is not set in stone
Wayland 1.0 will likely be released with basic touch support
  Richer touch and gesture support can be added later
  Notably, gestures may still be added as a first-class citizen

20 Gestures and the Future
In the X11 uTouch stack, gestures require round trips to sort out acceptance/rejection:
  uTouch Service
  X Server
  Application

21 Gestures and the Future
In a Wayland uTouch stack, gestures can be interleaved with touch events:
  Wayland-based Compositor
  Application
  uTouch Service

22 Questions?
Thank you

