VR Interfaces – Navigation, Selection and UI Elements
By David Johnson

VR Interfaces
How do we tell the computer to do things?
How do we select things?
How do we navigate around?

VR Interface Challenges
Intuitive
–Make interaction work like the real world
–Minimize cognitive overhead
Augmentation
–Give users new capabilities

Quick UI review
Norman's Principles of Design
–Make things visible
–Provide a good conceptual model
Affordance
Mapping
Constraints
Feedback

Visibility

Good Conceptual Model
A good conceptual model allows us to predict the effects of our actions
Without a good model we operate blindly
–Simply follow rules without understanding the reason
–No understanding of cause or effect
–No recourse when something breaks
Examples:
–Fridge/freezer controls
–Thermostat

Affordances

Mapping

Constraints
Prevent you from doing what you shouldn't do
–Grey out selections that don't apply at the current time

Feedback Examples
–Clicker on your turn signal
–Animated icon while waiting for a web page to load

Why is usability important?
Poor usability results in
–anger and frustration
–decreased productivity in the workplace
–higher error rates
–physical and emotional injury
–equipment damage
–loss of customer loyalty

2D Interfaces
The dominant computer interface uses a mouse and graphical elements
Xerox Star (1981)

2D Interfaces
Why is it called a WIMP interface?
–Windows
–Icons
–Menus
–Pointer
Xerox Star (1981)

3D Interfaces
Need to map 2D interfaces to 3D
Hopefully, create whole new expressive interfaces

3D equivalent of a Mouse?
Mouse
–2D positioning
–Buttons to hold or click

6DOF mouse
–Flying mouse
–Fledermaus
–The Bat
How do you clutch/ratchet?
–In 2D, picking up the mouse disables tracking

Menus in Virtual Space
Cannot easily overlay menus
"Float" menus in space
–Select by raycasting
–Keep near the user's head
Jacoby & Ellis, 1992
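
A floating menu selected by raycasting boils down to a ray-plane intersection followed by a 2D hit test against each item. A minimal sketch in Python (all names, and the rectangle-based menu layout, are illustrative assumptions rather than details of the Jacoby and Ellis system):

    import numpy as np

    def raycast_menu(ray_origin, ray_dir, menu_center, menu_normal, items):
        """Intersect a hand ray with a floating menu plane and return the index
        of the item hit, or None. `items` is a list of (u, v, half_w, half_h)
        rectangles expressed in the menu plane's own 2D coordinates."""
        ray_dir = ray_dir / np.linalg.norm(ray_dir)
        denom = np.dot(ray_dir, menu_normal)
        if abs(denom) < 1e-6:              # ray is parallel to the menu plane
            return None
        t = np.dot(menu_center - ray_origin, menu_normal) / denom
        if t < 0:                          # menu is behind the hand
            return None
        hit = ray_origin + t * ray_dir
        # Build in-plane axes (assumes the menu normal is not straight up).
        right = np.cross([0.0, 1.0, 0.0], menu_normal)
        right /= np.linalg.norm(right)
        up = np.cross(menu_normal, right)
        local = hit - menu_center
        u, v = np.dot(local, right), np.dot(local, up)
        for i, (cu, cv, hw, hh) in enumerate(items):
            if abs(u - cu) <= hw and abs(v - cv) <= hh:
                return i
        return None

Keeping the menu near the user's head then just means re-anchoring menu_center and menu_normal to the tracked head pose each frame.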

Menus in Virtual Space
Ring menus
–JDCAD, 1993, Liang
–ISAAC, 1995, Mark Mine
–Rotate hand to move selection point
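
Rotating the hand to move the selection point is essentially a mapping from the hand's roll angle to an item slot on the ring. A small sketch (the angle convention and item count are assumptions, not details from JDCAD or ISAAC):

    import math

    def ring_menu_index(hand_roll_rad, n_items):
        """Map a hand roll angle (radians) to the ring-menu slot it selects.
        Each item occupies an equal wedge of 2*pi/n_items."""
        wedge = 2.0 * math.pi / n_items
        angle = hand_roll_rad % (2.0 * math.pi)   # normalize to [0, 2*pi)
        return int(angle // wedge)

    # Example: with 8 items, a roll of 100 degrees lands in slot 2.
    print(ring_menu_index(math.radians(100), 8))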

Menus in Virtual Space
Darken, 1994
–Overlaid menus
–Speech selection

Menus in Virtual Space
Pen and Tablet
–Track a tablet and pen
–Put 2D menus on the tablet
–"Haptic Hand"

Menus in Virtual Space
Bowman's Pinch Gloves
–Select with thumb-to-finger pinches
–High-level menu on the non-dominant (ND) hand
–Secondary menu on the dominant (D) hand
–First tries: a scrolling menu using pinches, or more items on the pinkie

TULIP Menus
Three-Up, Labels In Palm
–Virtually raise the hands
–Rotate menus, putting 'next' groups on the palms
Users preferred TULIP over floating and tablet menus
–Though perhaps slower

Menus in Virtual Space
'Virtual tricorder', Wloka 1995

Gestures
Symbolic
–Cultural meaning (O.K. sign)
Deictic
–Pointing, directs the viewer's attention
Iconic
–Showing an example path with the hand
Pantomimic
–Act out the activity

Gestures
GIVEN (1992)
–Neural net recognition
–20 gestures: fly, grab, etc.
Mine
–"Physical mnemonics"
–Pull-down menus from near the head
–Delete by throwing over the shoulder

Numerical Input
Mark Mine
–A digit at a time
–Sliders too imprecise

Text Input
Bowman's pinch glove keyboard
–Thumb to a home-row finger selects a key
–Move hands in/out to go down/up a row
–Rotate to hit extra keys

Basic Navigation Tasks
Exploration
–Untargeted movement
–Build an internal map
Positioning
–Move to a known location
Maneuvering
–Precise positioning of the viewpoint
–Typically short motions

Natural Interfaces
Walking
Biking
Snowboarding
Swimming
Issues?

Walking workaround
Redirected walking
–(movie)
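
Redirected walking works by injecting small rotation gains, kept below the user's detection threshold, so that a virtually straight walk curves back inside the physical tracking space. A minimal sketch of the per-frame update, assuming a constant curvature gain (the gain value is purely illustrative, not a recommended threshold):

    import math

    def redirect_yaw(world_yaw, distance_walked,
                     curvature_gain=math.radians(3.0)):
        """Add a small extra world yaw per meter walked, steering the user
        along a physical arc while they perceive a straight path.
        `curvature_gain` is injected yaw in radians per meter (assumed)."""
        return world_yaw + curvature_gain * distance_walked

    # Called every frame with the distance covered since the last frame:
    # world_yaw = redirect_yaw(world_yaw, step_length)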

Flying Interfaces
Flying
Magic carpet
Guided navigation
–River analogy
Issues?

Steering Interfaces
Pointing (hand-directed)
–Expressive
–Hand shake is an issue
Torso-directed
Gaze-directed
–Simple
Physical device
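
Gaze-directed steering is the simplest of these: while the user holds a "go" input, the viewpoint advances along the current view direction. A sketch, with all names assumed:

    import numpy as np

    def steer(position, direction, speed, dt, moving):
        """Steering travel: advance the viewpoint along `direction`
        (gaze, torso, or hand forward vector) while the user holds the
        'go' input."""
        if not moving:
            return position
        return position + speed * dt * direction / np.linalg.norm(direction)

Pointing-directed steering uses the same update with the tracked hand's forward vector, which decouples travel direction from where the user is looking, at the cost of sensitivity to hand shake.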

Hand-based Interfaces
Colin Ware (1990s)
–World-in-hand
–Eye-in-hand

Point-to-point Travel
Select a point in a scene
–Computer picks path
Teleport
–Bowman et al. found significant spatial disorientation from teleport

World in Miniature
User holds a dynamic map in one hand
Navigation is reduced to object positioning
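
Because the miniature is a scaled, hand-anchored copy of the world, "navigation as object positioning" is a single change of coordinates: wherever the user drops their own proxy figure in the WIM becomes the new full-scale viewpoint. A sketch under the simplifying assumptions of a uniform scale and no miniature rotation:

    import numpy as np

    def wim_to_world(p_wim, wim_origin, wim_scale):
        """Map a point in the hand-held miniature back to world coordinates.
        `wim_origin` is where the world origin sits inside the miniature and
        `wim_scale` is the miniature's uniform scale factor."""
        return (p_wim - wim_origin) / wim_scale

    # Dropping the user's proxy at p_wim moves the viewpoint to:
    # new_viewpoint = wim_to_world(p_wim, wim_origin, wim_scale)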

WIM Setup
Physical props – clipboard and interface ball

Two-handed Flying
Mark Mine
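
In Mine's two-handed flying the vector between the hands sets the travel direction and the hand separation sets the speed, with hands brought together acting as a stop. A rough sketch (the speed scale and which hand leads are assumptions here):

    import numpy as np

    def two_handed_fly(position, back_hand, front_hand, dt,
                       speed_per_meter=2.0, dead_zone=0.1):
        """Fly along the vector between the hands; speed grows with their
        separation. Hands closer than `dead_zone` meters means stop."""
        v = front_hand - back_hand
        dist = np.linalg.norm(v)
        if dist < dead_zone:
            return position
        return position + (v / dist) * (speed_per_meter * dist) * dt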

Fundamental Operations in a UI
Select an object
Manipulate an object
–Translate
–Rotate
What are some techniques in 2D interfaces?

From the Beginning
Sutherland and Vickers
–Sorcerer's Apprentice (1972)
–Tracked stylus
Selection of vertices
–Intersection with a cube at the tip of the stylus

Pointing: Put-That-There (1979)
–Ray from a tracked hand
–Speech interface
(movie)

Pointing: JDCAD (1993)
"Laser gun" ray from the hand
–Tracker noise
–Harder to select far-away objects
Spotlight
–Add a cone to the ray
–Select objects based on distance from the cone axis and distance from the hand
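
Spotlight selection scores every candidate by how far it sits from the cone axis and how far it is from the hand, then picks the best score among objects inside the cone. A minimal sketch (the cone angle and the weighting between the two distances are illustrative assumptions):

    import numpy as np

    def spotlight_pick(hand_pos, cone_dir, objects,
                       cone_half_angle=np.radians(10.0),
                       axis_weight=1.0, range_weight=0.1):
        """Return the index of the object best captured by the selection cone,
        or None. `objects` is a list of 3D points (object centers)."""
        cone_dir = cone_dir / np.linalg.norm(cone_dir)
        best, best_score = None, float("inf")
        for i, p in enumerate(objects):
            to_obj = p - hand_pos
            dist = np.linalg.norm(to_obj)
            if dist < 1e-6:
                continue
            cos_a = np.clip(np.dot(to_obj / dist, cone_dir), -1.0, 1.0)
            angle = np.arccos(cos_a)
            if angle > cone_half_angle:
                continue                        # outside the spotlight
            off_axis = dist * np.sin(angle)     # distance from the cone axis
            score = axis_weight * off_axis + range_weight * dist
            if score < best_score:
                best, best_score = i, score
        return best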

Silk Cursor
Replace the wireframe selection box with a translucent box
–Visual cues to containment

Pointing: Aperture
Spotlight from the eye
Cone angle based on distance from hand to eye
Selection modified by hand orientation

Pointing: Flexible Pointer
–Two-handed
–Hand direction bends the pointer
–Can select occluded objects
(movie)

Hand: Go-Go Interaction (1996)
Go-Go uses a non-linear mapping between the virtual and real hand
–Control-display ratio
–Stretch Go-Go variation
Pros:
–Extended reach when needed
–Direct manipulation
Cons:
–Reach still limited by arm length
–Precision suffers when reach is extended (low level of control)
(movie)
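
The Go-Go mapping is one-to-one while the real hand stays near the body and grows quadratically past a threshold distance, which is what gives extended reach without hurting close-range work. A sketch of that mapping (the threshold D and gain k values are assumed, not the paper's calibration):

    def gogo_extent(r_real, D=0.35, k=6.0):
        """Go-Go non-linear arm extension (after Poupyrev et al., 1996).
        r_real: distance from the torso to the real hand, in meters.
        Returns the distance from the torso to the virtual hand."""
        if r_real < D:                 # inside the threshold: direct mapping
            return r_real
        return r_real + k * (r_real - D) ** 2

    # The virtual hand is placed along the torso-to-hand direction at the
    # returned distance, so a fully extended arm reaches meters into the
    # scene while nearby manipulation stays direct.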

Image Plane Techniques
Point or gesture at an object's projection onto the viewing plane
–"Head-crusher" (Kids in the Hall)
–"Sticky finger"
Similar to ray casting
Pros:
–Very intuitive
–Allows the user to reach objects at an arbitrary distance
Cons:
–Limited by the need for line of sight
–Can be fatiguing
–Virtual hand may obscure small objects
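
The "sticky finger" variant reduces to casting a ray from the eye through the fingertip and taking the first object it hits, which is why image-plane selection behaves so much like ray-casting. A hedged sketch with spheres standing in for scene objects (all names assumed):

    import numpy as np

    def sticky_finger_pick(eye, fingertip, spheres):
        """Image-plane selection: cast a ray from the eye through the
        fingertip and return the index of the nearest sphere hit, or None.
        `spheres` is a list of (center, radius) pairs."""
        d = fingertip - eye
        d = d / np.linalg.norm(d)
        best, best_t = None, float("inf")
        for i, (c, r) in enumerate(spheres):
            oc = eye - c
            b = np.dot(oc, d)
            disc = b * b - (np.dot(oc, oc) - r * r)
            if disc < 0:
                continue                       # the ray misses this sphere
            t = -b - np.sqrt(disc)             # nearest intersection
            if t <= 0:
                t = -b + np.sqrt(disc)         # eye may be inside the sphere
            if 0 < t < best_t:
                best, best_t = i, t
        return best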

Two-Handed and Body-Centered Interaction
What can you do with two hands?
What if you use your body as a reference point?
Mine, M., Brooks, F. P., Jr., and Séquin, C. (1997). Moving Objects in Space: Exploiting Proprioception in Virtual-Environment Interaction. Proceedings of SIGGRAPH 97, Los Angeles, CA.
"Scaled-world grab"

HOMER technique
Hand-Centered Object Manipulation Extending Ray-Casting
–Select: ray-casting
–Manipulate: with the hand
–Translation proportional to the initial object distance
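
The heart of HOMER is the distance scaling: at selection time the ratio of the object's distance to the hand's distance (both measured from the torso) is frozen, and the virtual hand that carries the object moves at the physical hand's distance multiplied by that ratio. A sketch (names assumed):

    import numpy as np

    def homer_virtual_hand(torso, hand_now, d_obj_at_grab, d_hand_at_grab):
        """Place the virtual hand (and the attached object) along the
        torso-to-hand direction, at the physical hand's distance scaled by
        the ratio fixed when the object was selected."""
        offset = hand_now - torso
        dist = np.linalg.norm(offset)
        scale = d_obj_at_grab / d_hand_at_grab
        return torso + (offset / dist) * (dist * scale)

This is why a small arm motion can translate a distant object a long way, and why precision drops as the initial object distance grows.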

World in Miniature
User holds a dynamic map in one hand
Objects can be moved in the map
What about fine positioning?
What about selection of small objects?

Voodoo Dolls
User creates miniature copies ("dolls") of objects with image-plane selection

Summary
–Ergonomics is an issue
–Usability is still low
–Standard GUI elements translate poorly