
Human Interaction: World in Miniature Papers

Interactive Worlds in Miniature (WIM)
1. Introduction
2. System Description
3. Findings from Previous Work
4. WIM Interaction Techniques
5. WIM Interaction Techniques (continued)
6. Visualisation & Multiple WIMs
7. Manipulating the WIM
8. Future Work & Implementation Issues

Introduction
What is a WIM? The World in Miniature (WIM) is a user interface that:
– is a hand-held 3D miniature version of the virtual environment, held within the virtual environment itself
– addresses real-world constraints
– places the world within the user's immediate reach
– allows interaction as a user interface:
– allows selection & manipulation of objects
– allows navigation, path planning, visualisation and multiple points of view

System Description
A head-mounted display (HMD) provides the user with an immersive point of view with 6 degrees of freedom.
A clipboard is placed in the user's non-dominant hand. The clipboard acts as the floor of the miniature and provides an aerial view; the graphics attached to it are the miniature copy of the world.
The user's other hand holds a tennis ball with two buttons and a tracking sensor built in; this is used for object selection in the WIM.
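As a rough illustration of how the tracked props could drive the miniature, the sketch below (Python with NumPy) shows one way the clipboard pose might anchor the WIM, and how a point touched by the ball cursor maps back to full-scale world coordinates. The names and values (`clipboard_pose`, `world_origin`, the 1:100 scale) are assumptions for illustration, not details from the paper.

```python
import numpy as np

def pose_matrix(position, rotation):
    """Build a 4x4 rigid transform from a position (3,) and a 3x3 rotation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

def wim_transform(clipboard_pose, world_origin, scale=0.01):
    """Transform mapping full-scale world coordinates onto the clipboard.

    clipboard_pose : 4x4 pose of the tracked clipboard (tracker space).
    world_origin   : 4x4 pose of the region of the world being miniaturised.
    scale          : miniature scale factor (e.g. 1:100).
    """
    S = np.diag([scale, scale, scale, 1.0])
    # world point -> local (relative to chosen origin) -> shrink -> attach to clipboard
    return clipboard_pose @ S @ np.linalg.inv(world_origin)

def wim_point_to_world(p_wim, clipboard_pose, world_origin, scale=0.01):
    """Invert the mapping: a point touched by the ball cursor inside the
    miniature corresponds to this full-scale world position."""
    M = wim_transform(clipboard_pose, world_origin, scale)
    p = np.append(p_wim, 1.0)
    return (np.linalg.inv(M) @ p)[:3]
```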

Findings from Previous Work
Object manipulation: 1:1 mappings of translations and rotations are easily understood. In earlier systems the user's hands were not immersed in the same physical space as the graphics; 3DM provided only one point of view, and scaling by the user was disorientating.
Navigation (movement through 3D space): Darken explored 2D WIM maps, which have since been extended into 3D WIM maps using a scene-in-hand metaphor. The NASA view took the concept further and used 2D viewports to jump from one virtual environment to another.
Object selection: techniques such as ray casting and selection cones have been tried; the main problem with these is object occlusion.

WIM Interaction Techniques
The WIM can change its point of view of the scene rapidly, which lets the user respond quickly to occlusions; a WIM fly-by does not destroy the point of view already established in the full-scale world.
Object selection overcomes the problems of range and occlusion.
Objects can be manipulated at different scales; using a 3D magnifying glass, objects can be displayed at greater than 1:1 scale.
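A minimal sketch of the scale relationships involved: dragging an object's proxy in the miniature translates the full-scale object by the same motion divided by the WIM scale, and a "3D magnifying glass" is simply a locally raised display scale. The function names and numbers below are illustrative, not taken from the paper.

```python
import numpy as np

def drag_proxy(object_world_pos, drag_delta_wim, wim_scale):
    """Dragging an object's proxy by drag_delta_wim (in miniature units)
    translates the full-scale object by the same motion scaled up by 1/wim_scale."""
    return np.asarray(object_world_pos) + np.asarray(drag_delta_wim) / wim_scale

def magnified_scale(base_scale, magnification):
    """A '3D magnifying glass' raises the local display scale of the miniature;
    a large enough magnification yields a greater-than-1:1 view."""
    return base_scale * magnification

# Example: a 1:100 WIM magnified 250x shows objects at 2.5:1.
print(magnified_scale(0.01, 250))   # -> 2.5
```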

WIM Interaction Techniques (continued)
The current implementation rotates the WIM by ratcheting (repeated grab, rotate and release motions).
Flying is the most common technique for navigation.
The user can be represented inside the WIM.
Updating the full-scale world after manipulation (sketched below):
1. Immediate
2. Post-facto
3. Batch
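One way to read the three update strategies: immediate updates move the world object continuously while its proxy is dragged, post-facto updates apply the change when the proxy is released, and batch updates wait for an explicit commit. The `ProxyBinding` class and the `set_pose` method on the world object are hypothetical, introduced only to make the distinction concrete.

```python
from enum import Enum

class UpdateMode(Enum):
    IMMEDIATE = 1   # world object follows the proxy continuously during the drag
    POST_FACTO = 2  # world object jumps to the proxy's pose when the drag ends
    BATCH = 3       # changes accumulate and are applied together on request

class ProxyBinding:
    def __init__(self, world_object, mode):
        self.world_object = world_object
        self.mode = mode
        self.pending = None

    def on_proxy_moved(self, new_world_pose):
        if self.mode is UpdateMode.IMMEDIATE:
            self.world_object.set_pose(new_world_pose)
        else:
            self.pending = new_world_pose      # defer the change

    def on_release(self):
        if self.mode is UpdateMode.POST_FACTO and self.pending is not None:
            self.world_object.set_pose(self.pending)
            self.pending = None

    def commit(self):
        """Called explicitly in BATCH mode, e.g. from an 'apply changes' action."""
        if self.pending is not None:
            self.world_object.set_pose(self.pending)
            self.pending = None
```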

Visualisation & Multiple WIMs
Visualisation uses:
1. Spatially locating and orientating the user
2. Path planning
3. History
4. Measuring distances
5. Viewing alternative representations
Multiple WIMs can show widely separated regions of the same environment.

Manipulating the WIM
Initially no props were used:
– no haptic feedback
– users 'contorted into uncomfortable positions'
With a separately tracked clipboard and tennis ball:
– easy two-handed rotation
– haptic feedback from the props
– the tennis ball allows easy 3D rotation of selected objects
– problems: fatigue; the props get in the way

Future Work & Implementation Issues
In practice the WIM is faster to work with than the virtual hand.
Future work: scrolling, clipping, zooming, multiple participants, 3D design for architects.
2D widgets in VR are still plausible.
The technique does not translate directly to a CAVE system.

Paper 2 – Hands-Free Multi-Scale Navigation in Virtual Environments (LaViola, Feliz, Keefe, Zeleznik)
Presents three hands-free interaction techniques in detail.
Discusses previous work related to navigation in virtual environments.
Translates the WIM concept from an HMD to a 'CAVE-like' system.
Incorporates feet and torso movements.

Other Work in This Area
SmartScene™
Uniport & Omni-Directional Treadmill
Head-directed navigation
SICS – Pär Hansson / Anders Wallberg

SmartScene™
Two-handed interface using SpaceGrips™
Novices learn quickly
Experts more productive than with keyboard and mouse
Comfortable / ergonomic
Not wireless

Darken – Uniport (1994)
Similar to a unicycle
Maps physical exertion to movement
Small movements are difficult
Direction of motion is difficult to control

Darken – Omni-Directional Treadmill
Mechanical: two layers of rollers interact to give motion in all directions on the floor plane
Servo motors return the user to the centre of the plane
Difficult to master due to feedback problems – quick movements should be avoided
Potential negative learning effect – doesn't accurately reflect real walking

Treadmill 2

Fuhrmann (1997) – Head directed motion

Cyber Monkey!
Tracker on head and hand
Microphone in ear
Speaker in mouth
Virtual 'guide'

The Step WIM
Overview
Invoking and Dismissing
Scaling
Locomotion by Leaning
Auto-Rotation
Summary and Future Work

The Step WIM – Overview
A miniature world map displayed on the floor
Used for overview or navigation
Its scale can be changed

The Step WIM – Invoking and Dismissing
Controlled using foot gestures for invoking, scaling and dismissing
The desired action is indicated by the current state and the direction of the user's gaze
The most natural and recognizable foot gesture: heel taps
Detected using "Interaction Slippers"
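A minimal sketch of heel-tap detection, assuming the Interaction Slippers report a boolean heel-contact signal each frame; the 0.4 s tap window is an assumed threshold, not a figure from the paper.

```python
import time

class HeelTapDetector:
    """Detects discrete heel taps from a boolean heel-contact signal
    (True while the heels are touching). Thresholds are illustrative."""

    def __init__(self, max_tap_duration=0.4):
        self.max_tap_duration = max_tap_duration
        self._contact_since = None

    def update(self, heels_touching, now=None):
        """Feed one sample per frame; returns True on the frame a tap completes."""
        now = time.monotonic() if now is None else now
        if heels_touching and self._contact_since is None:
            self._contact_since = now                      # contact just started
            return False
        if not heels_touching and self._contact_since is not None:
            duration = now - self._contact_since
            self._contact_since = None
            return duration <= self.max_tap_duration       # short contact = tap
        return False
```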

The Step WIM – Invoking and Dismissing 2
Alternative interface: an upward bounce
Detected using a waist tracker (tethered)
Monitor for a change in waist height Δh of more than 1.5" and the time for which it is held
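A sketch of how the upward-bounce detection might work from the tracked waist height; the 1.5-inch rise comes from the slide, while the timing window and structure are assumptions.

```python
class BounceDetector:
    """Flags an 'upward bounce' when the tracked waist height rises more than
    `min_rise` above its resting baseline and returns within `max_duration`
    seconds. The 1.5-inch rise is from the slide; the timing is assumed."""

    def __init__(self, baseline_height, min_rise=1.5 * 0.0254, max_duration=0.6):
        self.baseline = baseline_height      # metres, standing waist height
        self.min_rise = min_rise             # 1.5 inches converted to metres
        self.max_duration = max_duration
        self._rise_start = None

    def update(self, waist_height, t):
        rise = waist_height - self.baseline
        if rise > self.min_rise and self._rise_start is None:
            self._rise_start = t
            return False
        if rise <= self.min_rise and self._rise_start is not None:
            quick = (t - self._rise_start) <= self.max_duration
            self._rise_start = None
            return quick                     # a quick up-and-down = bounce
        return False
```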

The Step WIM – Scaling
Foot-based method: a heel click enters/leaves scaling mode
The head position projected onto the Step WIM is stored as the centre of scale
Walking then changes the centre of scale and the WIM size
Alternative interface: standing on tip-toes / crouching
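The geometry of scaling about the projected head position can be sketched as follows, assuming the floor WIM is drawn at an offset `wim_origin` with a uniform `scale`; keeping the centre of scale fixed means shifting the origin in proportion to the scale change. The representation itself is an assumption for illustration.

```python
import numpy as np

def rescale_step_wim(wim_origin, old_scale, new_scale, centre):
    """Rescale the floor WIM about `centre` (the head position projected onto
    the WIM) so the point under the user stays fixed while the map grows or
    shrinks. All points are 2D floor coordinates.

    If a world point w is drawn at f = wim_origin + old_scale * w, then after
    rescaling about `centre` it should sit at centre + (new_scale/old_scale) * (f - centre),
    which is achieved by moving the origin as below.
    """
    wim_origin = np.asarray(wim_origin, dtype=float)
    centre = np.asarray(centre, dtype=float)
    new_origin = centre + (wim_origin - centre) * (new_scale / old_scale)
    return new_origin, new_scale
```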

The Step WIM – Locomotion by Leaning
Navigate smaller distances (in the virtual world) by leaning in the desired direction
Leaning can also be used to move a larger Step WIM around
Waist and head are tracked; the vector between them is projected onto the floor to give the lean direction

The Step WIM – Locomotion by Leaning 2
Leaning is made more sensitive near the walls
Speed is modified according to head orientation: people generally focus on where they want to go, so speed is decreased as the head tilts downwards to maintain that focus
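A sketch of leaning-based locomotion under the description above: the waist-to-head vector is projected onto the floor to give a direction, its length beyond a small dead zone gives a speed, and downward head pitch attenuates that speed. The gain, dead zone and attenuation curve are illustrative values, not the paper's.

```python
import numpy as np

def leaning_velocity(head_pos, waist_pos, head_pitch_deg,
                     gain=2.0, dead_zone=0.05, max_speed=3.0):
    """Compute a travel velocity on the floor plane from a lean gesture.

    head_pos, waist_pos : 3D tracker positions (x, y, z), with y up.
    head_pitch_deg      : downward head pitch in degrees (0 = level).
    """
    head = np.asarray(head_pos, dtype=float)
    waist = np.asarray(waist_pos, dtype=float)

    # Project the waist-to-head vector onto the floor plane (drop y).
    lean = head - waist
    lean_floor = np.array([lean[0], 0.0, lean[2]])
    magnitude = np.linalg.norm(lean_floor)
    if magnitude < dead_zone:
        return np.zeros(3)                       # standing upright: no motion

    direction = lean_floor / magnitude
    speed = min((magnitude - dead_zone) * gain, max_speed)

    # Looking down at the Step WIM attenuates speed so the user keeps focus.
    attenuation = max(0.0, 1.0 - head_pitch_deg / 90.0)
    return direction * speed * attenuation
```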

The Step WIM – Auto-Rotation
Allows the full 360 degrees of the environment to be viewed on three walls
Uses amplified rotation – but without causing cybersickness!
Prototypes:
– linear mapping counter-rotating the world against head movement, so that 120° of physical rotation gives 180° of displayed rotation – caused some sickness
– using torso rotation – eliminates unnatural rotation when just moving the head, but still felt unnatural
– non-linear mapping, applied only once the user has rotated beyond a threshold
Combine with leaning to travel in directions originally behind the user
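The third prototype (non-linear amplification beyond a threshold) might look something like the sketch below: rotation is 1:1 for small movements, and the extra display rotation eases in only past the threshold, reaching 180° of displayed rotation for 120° of physical rotation. The easing curve and constants are assumptions, not the authors' mapping.

```python
def amplified_rotation(torso_yaw_deg, threshold_deg=60.0, max_user_deg=120.0,
                       max_display_deg=180.0):
    """Map the user's physical torso rotation to a displayed world rotation.

    Below `threshold_deg` the mapping is 1:1, so small natural movements are
    untouched. Beyond it, extra rotation ramps in smoothly until
    `max_user_deg` of physical rotation yields `max_display_deg` on screen.
    """
    yaw = abs(torso_yaw_deg)
    sign = 1.0 if torso_yaw_deg >= 0 else -1.0
    if yaw <= threshold_deg:
        return sign * yaw
    # Normalised progress through the amplified zone, eased with a smoothstep
    # so the amplification starts gradually at the threshold.
    t = min((yaw - threshold_deg) / (max_user_deg - threshold_deg), 1.0)
    eased = t * t * (3.0 - 2.0 * t)
    extra = (max_display_deg - max_user_deg) * eased
    return sign * (yaw + extra)
```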

Razzaque – Redirected Walking
Similar to auto-rotation – but better!
Takes advantage of the fact that a blindfolded person walks in an arc
Subtle increases in rotation allow the environment to be imperceptibly reoriented
Does not cause motion sickness!
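The core idea of redirected walking can be sketched as a small per-frame rotation injection, scaled with the user's own head rotation so it stays below perceptual thresholds. The gain values below are placeholders; Razzaque's actual controller chooses them dynamically to steer the user within the tracked space.

```python
def redirect_yaw(world_yaw_deg, user_yaw_rate_deg_s, dt,
                 rotation_gain=0.1, baseline_deg_s=0.5):
    """Inject a small, ideally imperceptible extra rotation into the world
    each frame so the user is gradually steered back towards the centre of
    the tracked space.
    """
    # Slightly scale the user's own head rotation (harder to notice while
    # turning) and add a tiny constant drift while walking straight.
    injected = rotation_gain * user_yaw_rate_deg_s + baseline_deg_s
    return world_yaw_deg + injected * dt
```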

The Step WIM – Summary
Advantages:
– completely hands-free, allowing easier manipulation etc.
– can navigate large distances easily using the Step WIM, which can be scaled as desired
– can navigate smaller distances by leaning, independently of head orientation
Disadvantages:
– requires a belt and possibly slippers to be worn as well as the headset
– auto-rotation can be disorienting

The Step WIM – Possible Developments
Modify to use only head tracking (and perhaps slippers)
Improve leaning detection by putting pressure sensors in the slippers
Additional hands-free controls: include a walking-in-place technique, speech recognition

WIM – Overall Conclusions
Multiple points of view
Multiple scales
Easy selection at a distance
Easy to learn
Easy to travel long distances
Not suitable for all applications (e.g. medical)