The subArctic Input System and Extensions for Handling Inputs with Ambiguity

1 The subArctic Input System and Extensions for Handling Inputs with Ambiguity

2 subArctic
• A Java-based GUI toolkit that I (along with Ian Smith) built and distributed in the mid-1990s
• Goal: highly extensible, allowing support for lots of cool new interaction techniques
  – Emphasis on making new and strange widgets / components / interactors easy to create
  – "High ceiling"

3 Parties involved with a toolkit
• Toolkit designer (me)
• Interactor designer
• Interface programmer
• User

4 Parties involved with a toolkit
• Toolkit designer (me)
• Interactor designer
• Interface programmer
• User
(Most toolkits target support here)

5 Parties involved with a toolkit
• Toolkit designer (me)
• Interactor designer
• Interface programmer
• User
(By moving work up, into a reusable library)

6 Parties involved with a toolkit
• Toolkit designer (me)
• Interactor designer
• Interface programmer
• User
(But typically they don't help much here: they assume a fixed library)

7 subArctic
• Toolkit designer (me)
• Interactor designer
• Interface programmer
• User
(SA tries to move work for many kinds of interactors into the toolkit infrastructure)

8 subArctic
• Toolkit designer (me)
• Interactor designer
• Interface programmer
• User
(SA tries to move work for many kinds of interactors into the toolkit infrastructure; the input system is a big part of that)

9 Schema for pretty much all GUIs
    init();
    for (;;) {
        evt = wait_for_next_event();
        dispatch(evt);
        if (damage_exists()) redraw();
    }

10 Schema of a GUI
    init();
    for (;;) {
        evt = wait_for_next_event();
        dispatch(evt);
        if (damage_exists()) redraw();
    }
Event record: a recording of the relevant facts about some occurrence of interest (i.e., the user has manipulated an input device)

11 Schema of a GUI
    init();
    for (;;) {
        evt = wait_for_next_event();
        dispatch(evt);
        if (damage_exists()) redraw();
    }
Send ("dispatch") the event to the object(s) that want it and/or know how to respond to it (e.g., a widget/component/interactor)

12 Event dispatch
• All the work happens here
• Typically delegated to interactors
  – E.g., buttons know how to respond to press and release like buttons should
  – Each object keeps track of its own state
• … but which interactor gets it? ⇒ the toolkit's "event dispatch" process

13 Event dispatch policies
• Two primary ways to decide which interactor gets an event
• What are they?

14 Event dispatch policies
• Two primary ways to decide which interactor gets an event
  – Positional dispatch
    · Based on where the mouse is pointing
    · Examples…
  – Focus-based dispatch
    · Designated object always gets the input
    · Examples…

15 Pop quiz
• Should input for dragging be dispatched via positional or focus?

16 Pop quiz
• Should input for dragging be dispatched via positional or focus?
• Answer: No! (both: positional to grab the object on the press, focus so the drag keeps receiving events once the pointer leaves it)

17 subArctic input policies
• subArctic encapsulates these "ways of dispatching inputs" in "dispatch policy objects" (sketch below)
  – Manages bookkeeping (e.g., picking)
  – Extensible set
    · Turns out there are other useful policies (e.g., for modal dialogs)
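A minimal sketch of the idea in Java (hypothetical names throughout; this is not subArctic's actual API): each policy object owns one routing rule plus its bookkeeping, and the toolkit simply tries its installed policies in turn.

    // Hypothetical sketch of "dispatch policy objects"; all names are illustrative.
    interface interactor { boolean handle(event e); }
    record event(String kind, int x, int y) {}

    interface dispatch_policy {
        boolean dispatch(event e);   // true if some interactor consumed the event
    }

    // Positional dispatch: route to whatever interactor is under the pointer (picking).
    class positional_policy implements dispatch_policy {
        private final java.util.function.BiFunction<Integer, Integer, interactor> pick;
        positional_policy(java.util.function.BiFunction<Integer, Integer, interactor> pick) { this.pick = pick; }
        public boolean dispatch(event e) {
            interactor target = pick.apply(e.x(), e.y());   // picking bookkeeping lives in the policy
            return target != null && target.handle(e);
        }
    }

    // Focus-based dispatch: a designated interactor gets the input regardless of position.
    class focus_policy implements dispatch_policy {
        private interactor focus;
        void set_focus(interactor i) { focus = i; }
        public boolean dispatch(event e) { return focus != null && focus.handle(e); }
    }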

18 When interactors get events…
• … they typically respond to them with the equivalent of a simple finite state machine (sketch below)
[Diagram: FSM with transitions on Press, Move, Release]
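For concreteness, here is roughly the press/move/release machine an interactor would otherwise have to keep by hand (illustrative Java only; none of these names come from subArctic). It is exactly this repetitive bookkeeping that the following slides move into the toolkit.

    // Illustrative only: a hand-rolled drag FSM inside an interactor.
    class draggable_box {
        private enum state { IDLE, DRAGGING }
        private state st = state.IDLE;
        private int x, y;                       // current position of the box

        void on_event(String kind, int ex, int ey) {
            switch (st) {
                case IDLE:
                    if (kind.equals("press")) st = state.DRAGGING;   // Press: start the drag
                    break;
                case DRAGGING:
                    if (kind.equals("move")) { x = ex; y = ey; }     // Move: drag feedback
                    else if (kind.equals("release")) {               // Release: finish
                        x = ex; y = ey;
                        st = state.IDLE;
                    }
                    break;
            }
        }
    }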

19 subArctic has a library of common FSMs
• Moves a lot of the input-handling work typically done by the interactor programmer up into the toolkit
• One (highly parameterized) FSM for all
  – Brad's "interactor" model (awful terminology :-)
• Many customized FSMs (extensible set)
  – subArctic input model

20 FSMs moved to a toolkit object
• A "dispatch agent"
• Translates low-level input into higher-level terms

21 Dispatch agent example: move_drag
• Translated to calls in an input protocol:
  – drag_start();
  – drag_feedback();
  – drag_end();
• With useful parameters (e.g., the new position)
[Diagram: the press / move / release drag FSM]

22 Dispatch agent example: move_drag
• Translated to calls in an input protocol:
  – drag_start();
  – drag_feedback();
  – drag_end();
• With useful parameters (e.g., the new position)
• Defined by a Java interface (sketch below)
[Diagram: the press / move / release drag FSM]
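A hypothetical rendering of that protocol interface (the slide names only the three calls; the parameter lists and the interface name here are my assumptions, not the literal subArctic signatures). The move_drag agent runs the FSM and simply invokes these methods on whichever interactor implements the interface.

    // Hypothetical input-protocol interface driven by the move_drag dispatch agent.
    interface move_drag_protocol {
        void drag_start(int start_x, int start_y);    // agent saw the press: drag begins
        void drag_feedback(int new_x, int new_y);     // agent saw a move: show feedback at the new position
        void drag_end(int end_x, int end_y);          // agent saw the release: drag completes
    }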

23 Set of dispatch agents is extensible
• E.g., can subclass for specialized kinds of drag such as "drag_within_box" or "snap_drag" (sketch below)
  – Can create a custom agent for one interface
  – Once created, it can be reused
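As a sketch of that kind of specialization (class and method names invented for illustration), a snap_drag agent could subclass a generic drag agent and quantize positions before any interactor sees them:

    // Illustrative subclassing of a drag dispatch agent; all names are hypothetical.
    class move_drag_agent {
        // Hook applied by the agent's FSM to each reported drag position.
        protected int[] filter_position(int x, int y) { return new int[] { x, y }; }
    }

    class snap_drag_agent extends move_drag_agent {
        private final int grid;
        snap_drag_agent(int grid) { this.grid = grid; }

        @Override
        protected int[] filter_position(int x, int y) {
            // Snap to the nearest grid point before drag_feedback()/drag_end() ever see it.
            return new int[] { Math.round((float) x / grid) * grid,
                               Math.round((float) y / grid) * grid };
        }
    }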

24 How it all goes together
[Diagram: incoming events are routed through dispatch policies (focus policy, positional policy, etc.) to dispatch agents (press, click, rollover, text, move drag, grow drag, etc.)]

25 How does an interactor indicate it wants / can handle some type of input?
• "… implements input_protocol"
  – where "input_protocol" is an interface with calls like drag_start(), etc.
• For positional dispatch, that's it!
• For focus-based dispatch, it must also ask for the focus (sketch below)
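A sketch of both cases, again with invented names (in particular, the focus-request call is an assumption; subArctic's real spelling differs):

    // Hypothetical interactor that opts into dragging by implementing the protocol
    // sketched under slide 22.
    class slider implements move_drag_protocol {
        int value;
        public void drag_start(int x, int y)    { /* remember the grab point */ }
        public void drag_feedback(int x, int y) { value = x;  /* redraw the thumb */ }
        public void drag_end(int x, int y)      { value = x;  /* commit, notify listeners */ }
    }

    // Positional dispatch: implementing the interface is all that is needed; the
    // positional policy picks this interactor when the press lands on it.
    //
    // Focus-based dispatch: the interactor must additionally ask for the focus,
    // e.g. (hypothetical call):  text_focus_agent.request_focus(this);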

26 Example: Hypertext for all
• A user (Ken Anderson) wanted to add hyperlinks to all objects
  – Hold down the control key and click
  – His external hyperlink database would take over and map the interactor id to a hyperlink target
  – But… how do you change every interactor to do this?

27 Example: Hypertext for all
• In Swing, Motif, etc. this is essentially impossible
• In SA, just insert a new subclass of the "click" dispatch agent that checks for the control key being down
  – About 15 lines of code (sketch below)
  – Works for interactors written later!
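The slide does not show the code, but its shape is roughly the following (all class and method names are stand-ins I invented; the real subArctic click agent's API differs):

    // Sketch only: minimal stand-ins for the toolkit pieces involved.
    interface raw_event { boolean control_down(); }
    interface hyperlink_db { Object target_for(Object interactor); void follow(Object link); }
    class click_agent {
        protected boolean deliver_click(Object target, raw_event evt) { return false; /* normal click dispatch */ }
    }

    // The extension itself: a click agent that checks for control-click and follows a hyperlink.
    class hyperlink_click_agent extends click_agent {
        private final hyperlink_db links;
        hyperlink_click_agent(hyperlink_db links) { this.links = links; }

        @Override
        protected boolean deliver_click(Object target, raw_event evt) {
            if (evt.control_down()) {
                links.follow(links.target_for(target));    // map interactor id to hyperlink target
                return true;                               // consumed; suppress the ordinary click
            }
            return super.deliver_click(target, evt);       // otherwise behave like a normal click
        }
    }
    // Installed once into the toolkit's agent list, this applies to every interactor,
    // including ones written later.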

28 Questions about the SA input system?

29 Providing Toolkit-Level Support for Handling Ambiguity in Recognition-Based Input
Jennifer Mankoff, Gregory Abowd (Georgia Institute of Technology)
Scott Hudson (Carnegie Mellon University)

30 Motivation
• Recognition-based input offers the promise of naturalistic input modalities, BUT…

31 Motivation
• Recognition-based input offers the promise of naturalistic input modalities, BUT…
• Recognizers are imperfect
  – affects users
  – breaks current system models
⇒ New interfaces & mechanisms

32 Example Interaction
• From Newton
• Handwritten text

33 Example Interaction
• From Newton
• Handwritten text automatically replaced with the best recognition result
[Screenshot: the recognized word "test"]

34 Example Interaction
• Double-tap to get a correction interactor
[Screenshot: the recognized word "test"]

35 Example Interaction
• Correction interactor (mediator)
[Screenshot: n-best list ("test", "text", "teat", "ted") plus options for keyboard entry, character correction, and revert to strokes]

36 Example Interaction
• Works well, but…
  – Not reusable or customizable
  – Hard to grow your own
⇒ Basically we don't have toolkit support for recognition-based UI

37 Motivation (cont.)
• At much the same stage we were at for GUIs in 1983
  – No common model for input
  – No re-use
    · Infrastructure
    · "Widget library"

38 Goals of This Work
• Robust, reusable infrastructure
• Reusable library
• Integrate with a conventional toolkit
  – Don't throw out the baby with the bathwater

39 Talk Roadmap
• Requirements for handling uncertain input
• Extending toolkits to handle it
• Interaction techniques for ambiguity
• Implementation

40 Invoking Application Actions
• Actions are often done by callbacks
  – Direct procedure call to the application
• Hierarchical events are an alternate approach
  – Delivered to the app as well as the toolkit

41 Hierarchical Events
• Low-level events contribute to the production of higher-level events [Green TOG '86; Myers & Kosbie CHI '96]
[Diagram: user input (down, drag, up) and the corresponding events it produces (stroke, then circle)]

42 Implicit Assumption of Certainty
• Implicit in all this is the assumption that the events really happened as reported
• Problems arise when this isn't true
  – E.g., brittle dialogs

43 Needed to Handle Uncertainty:
• Allow for (and explicitly model) multiple alternatives
  – alternative higher-level events
  – in a recognition context: interpretations
• Detect conflicting interpretations
• Mediation of conflicts

44 Needed to Handle Uncertainty:
• Lexical feedback about uncertain events
  – split "feedback" from "action"
• Library of mediators

45 How do we do this...

46 Extended Event Model
• Uncertainty results in multiple interpretations ⇒ an interpretation graph (sketch below)
[Diagram: certain input: down, drag, up → stroke → circle; uncertain input: down, drag, up → stroke → circle or box]
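One way to picture the data structure (a simplified sketch, not the actual subArctic extension): every event carries a set of possible interpretations, each of which is itself an event, so uncertainty turns the usual event chain into a graph that must later be resolved.

    // Simplified sketch of an interpretation-graph node; illustrative only.
    import java.util.ArrayList;
    import java.util.List;

    class ambiguous_event {
        final String name;                                                // e.g. "stroke", "circle", "box"
        final List<ambiguous_event> interpretations = new ArrayList<>();  // higher-level readings
        boolean accepted, rejected;                                       // filled in later by mediation

        ambiguous_event(String name) { this.name = name; }

        ambiguous_event interpret_as(String higher) {                     // a recognizer adds an interpretation
            ambiguous_event e = new ambiguous_event(higher);
            interpretations.add(e);
            return e;
        }

        boolean is_ambiguous() { return interpretations.size() > 1; }
    }

    // The slide's example: a stroke read as either a circle or a box.
    //   ambiguous_event stroke = new ambiguous_event("stroke");
    //   stroke.interpret_as("circle");
    //   stroke.interpret_as("box");
    //   stroke.is_ambiguous();   // true -> mediation needed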

47 Toolkit Extensions
• The toolkit's job is still to deliver events to objects
  – Now delivered to recognizers, interactors, and application objects
[Diagram: events delivered to a button, checkbox, menu, and recognizer]

48 Toolkit Extensions
• The toolkit's job is still to deliver events to objects
  – Objects initially only produce (reversible) feedback, no actions
[Diagram: events delivered to a button, checkbox, menu, and recognizer]

49 Another Change: Interface Appearance
• Events dispatched to all who might use it
[Diagram: uncertain event hierarchy: down, drag, up → stroke → circle or box]

50 Details: Arranging for Mediation
• Identify any conflicts
• Look for a mediator
  – Pluggable list of them in the toolkit
• Mediator chosen by a meta-mediator
• A mediator can: "Pass", "Pause", "Accept" (sketch below)
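A minimal sketch of that plumbing (hypothetical interface and names, reusing the ambiguous_event sketch from slide 46): the meta-mediator walks the pluggable list, and each mediator answers Pass (not my problem), Pause (defer, e.g. wait for the user), or Accept (conflict resolved).

    // Hypothetical sketch of pluggable mediation; names and signatures are illustrative.
    import java.util.List;

    enum mediation_result { PASS, PAUSE, ACCEPT }

    interface mediator {
        mediation_result mediate(ambiguous_event root);    // inspect the graph, maybe resolve it
    }

    class meta_mediator {
        private final List<mediator> mediators;             // pluggable, ordered list kept by the toolkit
        meta_mediator(List<mediator> mediators) { this.mediators = mediators; }

        void resolve(ambiguous_event root) {
            if (!root.is_ambiguous()) return;                // no conflict, nothing to mediate
            for (mediator m : mediators) {
                switch (m.mediate(root)) {
                    case ACCEPT: return;                     // resolved: accept/reject recorded in the graph
                    case PAUSE:  return;                     // deferred (e.g. a choice mediator awaits the user)
                    case PASS:   break;                      // not handled: try the next mediator
                }
            }
        }
    }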

51 Doing Mediation
• Example: the user selects an interpretation
[Diagram: a choice mediator offering "circle" and "box"; the user picks "circle"]

52 Doing Mediation (cont.)
• The mediator prunes the interpretation graph to a tree
  – App informed of accepts & rejects
[Diagram: the graph (down, drag, up → stroke → circle or box) pruned to a tree (down, drag, up → stroke → circle)]

53 Mediation Strategies
• Many mediation strategies
  – e.g., automatic vs. user involvement
• Toolkit is fully "pluggable" (!)
  – Library of mediators provided, but
  – Can extend/build new ones as needed
• Research goal: finding new ones

54 Providing a Library of Mediators

55 Providing a Library of Mediators
• Survey of existing techniques [Abowd & Mankoff, GVU Tech Report '99]
  – Automatic
  – User involvement
    · Repetition & repair strategies
    · Choice strategies

56 Automatic Mediation Techniques
• Probability modeling and thresholding
• Historical statistics
• Rule-based

57 User Involvement: Repetition & Repair Strategies
• Undo and repeat
• Change of modality
• Partial repair

58 User Involvement: Choice Strategies
• Variations on n-best "lists"
  – Presentation form
  – Instantiation time
  – Contextual information
  – Interaction
  – Feedback
• Many techniques via "parameterization"
  – Ripe for toolkit support (sketch below)
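For example, a choice mediator could expose most of those variations as constructor parameters (a hypothetical sketch building on the mediator interface above), which is what makes this family of techniques such a natural fit for a toolkit:

    // Hypothetical parameterized n-best choice mediator; illustrative only.
    class n_best_choice_mediator implements mediator {
        enum presentation { LINEAR_LIST, GRID, IN_PLACE_OVERLAY }   // presentation form
        enum instantiation { IMMEDIATE, ON_PAUSE, ON_DEMAND }       // instantiation time

        private final presentation form;
        private final instantiation timing;
        private final int max_choices;            // how much of the n-best list to show
        private final boolean show_context;       // include contextual information (e.g. surrounding text)

        n_best_choice_mediator(presentation form, instantiation timing, int max_choices, boolean show_context) {
            this.form = form; this.timing = timing;
            this.max_choices = max_choices; this.show_context = show_context;
        }

        public mediation_result mediate(ambiguous_event root) {
            if (!root.is_ambiguous()) return mediation_result.PASS;
            // Build a choice presentation from the top interpretations according to the
            // parameters above, then wait for the user's selection.
            return mediation_result.PAUSE;        // defer until the user picks an alternative
        }
    }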

59 Implementation
• Added to the subArctic toolkit
  – Reusable
  – Fully "pluggable"
  – Full existing library still works as-is (!)
• Small library of mediators
• Also working on a non-GUI toolkit

60 Experience
• Major example: Burlap
  – A smaller version of SILK [Landay]
  – For sketching UI designs and turning them into functioning interfaces

61 Conclusions
• Reusable infrastructure to support ambiguous input
  – Reduces the difficulty of creating UIs
  – Easier to explore the new design space
• Done by modifying a toolkit, not a separate mechanism
  – Integrated with conventional input
  – Other support from the toolkit still useful
