Interaction techniques for post-WIMP interfaces
Lawrence Sambrooks
Supervisor: Dr Brett Wilkinson

Summary
- Recap
- Project changes
- Timeline
- Issues
- Development of applications
- User testing
- Initial results
- Remaining tasks

Recap
- Recent exposure to and use of alternative interaction devices for post-WIMP interfaces; for example, touch on tablets and Kinect gestures on the Xbox 360
- Investigate touch and gesture interaction techniques and compare them to the mouse and keyboard
- Develop applications to test performance and usability

Project changes
- Added a Fitts' law style application to evaluate performance
- Replaced the data visualisation application with a 3D cube docking/alignment application
- Included speech alongside gestural interaction
  - Studies suggest a preference for using gestures and speech together (Hauptmann 1989)
  - "Put-That-There" system by Bolt (1980)

Timeline


Issues
- Performance of the Windows-based tablet
  - Poor touch responsiveness
  - Not enough CPU and GPU "grunt"
  - Affected development of application 4 in 3D (touch)
- Changes to the Kinect SDK during development
  - Beta 2 to version 1.0 added new functionality but changed the public interface
- Getting a reasonable number of participants for testing

Development
- 4 applications
- Written in C# and WPF
- All applications run full-screen at 1280x

Development
- Tools: Visual Studio 2010, Expression Blend 4
- Libraries: .NET Framework 4.0, Kinect SDK beta 2 / 1.0, Helix Toolkit, Coding4Fun Kinect Toolkit, Petzold.Media3D

Application 1
- Performance evaluation using Fitts' law
- 100 randomly sized and positioned squares
- Divided into groups, with a break in-between each group
- Accepts mouse, touch, and Kinect (gesture) input
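The per-target measure behind this evaluation can be sketched briefly. This is an illustrative Python sketch, not the project's C# code, and it assumes the Shannon formulation of Fitts' index of difficulty (the slides do not say which formulation was used); the function names and example numbers are hypothetical:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits/s for a single target acquisition."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical example: a 30 px square 450 px away, acquired in 0.5 s
print(index_of_difficulty(450, 30))      # log2(16) = 4.0 bits
print(throughput(450, 30, 0.5))          # 4.0 / 0.5 = 8.0 bits/s
```

Averaging this throughput over the 100 targets per technique gives figures of the kind reported in the results slides.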

Screenshot: Application 1

Applications 2–4
- 3D cube docking/alignment task
- 10 pseudo-random targets
- Manipulation via Rotate-Scale-Translate (RST) operations
- Separate application for each interaction technique
- Each application shares a common library (CubeEx.dll)
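An RST manipulation composes three transforms in a fixed order. A minimal Python sketch (the project itself used C#/WPF with Petzold.Media3D); the rotation here is restricted to the z-axis for brevity, and the function name is illustrative:

```python
import math

def rst(point, angle_z, scale, translation):
    """Apply a rotate-scale-translate manipulation to a 3D point."""
    x, y, z = point
    # Rotate about the z-axis
    c, s = math.cos(angle_z), math.sin(angle_z)
    x, y = c * x - s * y, s * x + c * y
    # Uniform scale
    x, y, z = scale * x, scale * y, scale * z
    # Translate
    tx, ty, tz = translation
    return (x + tx, y + ty, z + tz)

# A 90-degree rotation takes (1,0,0) to (0,1,0); scaling doubles it;
# the translation then lifts it along z.
corner = rst((1.0, 0.0, 0.0), math.pi / 2, 2.0, (0.0, 0.0, 3.0))
print(corner)  # approximately (0.0, 2.0, 3.0)
```

In the full application each interaction technique would drive the same three parameters (rotation, scale, translation) through its own input events.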

Gestures and speech (Application 2)
- Gestures control manipulation
  - One hand (left or right) for translation
  - Two hands for scale and rotation
- Speech invokes commands
  - Audible and visual feedback when a command is recognised
  - 70% confidence threshold for recognition
  - Not perfect!
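The 70% confidence gate can be sketched as a simple filter on recognition results. This Python sketch is illustrative only (the real application used the Kinect SDK's speech recognition from C#); the command names and function are hypothetical:

```python
CONFIDENCE_THRESHOLD = 0.70  # from the slides: 70% minimum confidence

# Hypothetical phrase-to-command mapping
COMMANDS = {"scale": "enter_scale_mode", "rotate": "enter_rotate_mode"}

def handle_recognition(phrase, confidence):
    """Fire a command only when the recogniser's confidence clears
    the threshold; otherwise reject the result as too uncertain."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # rejected: below threshold, no command fired
    return COMMANDS.get(phrase)  # None for unknown phrases

print(handle_recognition("rotate", 0.85))  # enter_rotate_mode
print(handle_recognition("rotate", 0.55))  # None
```

Raising the threshold trades missed commands for fewer false activations, which is the tension behind the "Not perfect!" note.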

Screenshot: Application 2

Mouse and keyboard (Application 3)
- Mouse controls manipulation of the camera/cube
  - Left-click-and-drag to translate the cube
- Keyboard used to invoke scale and rotate modes
  - Hold Z to scale
  - Hold X to rotate
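The hold-to-activate modes amount to a tiny state machine over the currently held keys. A Python sketch (the actual application handles WPF key events in C#); the class and priority of Z over X are assumptions for illustration:

```python
class ManipulationMode:
    """Track held keys and derive the active manipulation mode:
    hold Z to scale, hold X to rotate, otherwise translate."""

    def __init__(self):
        self.held = set()

    def key_down(self, key):
        self.held.add(key)

    def key_up(self, key):
        self.held.discard(key)

    @property
    def mode(self):
        if "Z" in self.held:
            return "scale"
        if "X" in self.held:
            return "rotate"
        return "translate"  # default: left-drag translates the cube

m = ManipulationMode()
print(m.mode)      # translate
m.key_down("Z")
print(m.mode)      # scale
m.key_up("Z")
m.key_down("X")
print(m.mode)      # rotate
```

Because the mode is derived from held keys rather than toggled, releasing the key immediately returns the user to translation, matching the "hold" behaviour on the slide.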

Multi-touch (Application 4)
- Tablet performance prohibited creation of a 3D application
  - Poor touch responsiveness
  - Not enough CPU and GPU "grunt"
- Due to the time available, compromised on a 2D version
  - Still allows some evaluation of usability

Screenshot: Application 4

User testing
- 2 experiments utilising all 4 applications
  - Experiment 1: Performance
  - Experiment 2: Usability
- 15 participants
- Repeated-measures design
  - Order of interaction techniques shuffled
  - No practice time
- Questionnaire at the conclusion of each experiment
  - Multiple-choice, short answer, and Likert scale questions
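Shuffling the order of techniques per participant counters learning effects in a repeated-measures design. A minimal sketch, assuming simple seeded randomisation rather than a formal Latin square (the slides do not say which scheme was used); names are illustrative:

```python
import random

TECHNIQUES = ["mouse", "touch", "gesture"]

def condition_order(participant_id, seed="study-1"):
    """Return a shuffled technique order for one participant.
    Seeding by participant id makes the assignment reproducible."""
    rng = random.Random(f"{seed}-{participant_id}")
    order = TECHNIQUES[:]          # copy so the master list is untouched
    rng.shuffle(order)
    return order

for pid in range(3):
    print(pid, condition_order(pid))
```

Each participant still experiences every technique (repeated measures); only the order varies, so order effects average out across the 15 participants.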

Initial results (Experiment 1 – Performance)
- On average, touch performed best (9.83 bits/s), followed by the mouse (7.76 bits/s) and gestures (1.23 bits/s)
- Acquiring targets took ~3.5 times longer with gestures than with mouse/touch
- Participants missed an average of 6 targets (per 100) using the mouse, 12 using touch, and 32 using gestures

Results browser (Experiment 1 – Performance)

Initial results (Experiment 2 – Usability)
- Previous experience
  - 85%+ with multi-touch (attributed to smartphone/tablet ownership)
  - About 50% with gestures
- Participants overwhelmingly preferred the mouse
  - Mouse had the lowest number of "misses"; touch and gestures were both around 3 times worse
  - Main reason cited was familiarity, followed by speed and precision

Initial results (Experiment 2 – Usability)
- Participant ratings of performance for the techniques used
  - Mouse and touch best for translation
  - Mouse best for rotation
  - Mouse best for scaling
- Noted issues/frustrations
  - Lack of precision when using gestural interaction
  - Accuracy of voice recognition
  - Gulf of execution for the Kinect (gesture) task

Results browser (Experiment 2 – Usability)

Remaining tasks
- In-depth analysis of results
- Prepare for expo (poster, application demo, etc.)
- Complete thesis

Questions?