Francois Rioux, Frank Rudzicz, Mike Wozniewski. HCI Project Presentation, McGill University, April 6, 2004.

:: Overview
- Proposed to improve visualization of 3D design.
- The application is *immersive*: it allows the user to place him/herself inside the 3D world they are designing.
- Uses a *two-handed*, gesture-based interface.
- Allows for real-time rendering of the display: no need to switch between design mode and render mode.

:: Usage
The system could benefit many disciplines:
- Computer-Aided Design (CAD)
- 3D game design
- Architecture, engineering, 3D layout
- Etc.

:: Usage
In our prototype, we focus on interior design:
- The user can place models of furniture, decorations, and appliances in a room.
- The user can modify these models: rotate and translate, apply textures/colors, etc.
- The user can save/restore various configurations.
- The user can invite his/her client to navigate around the room to get a sense of the space before construction begins.

:: Design Decisions
Unique design decisions:
- Bimanual (two-handed) interaction
- Toolglass interface widgets

:: Bimanuality
The properties of bimanual (two-handed) interaction must be considered as a design constraint:
- There is structure to bimanual manipulation: asymmetry and division of labor (Guiard 1987).
- Non-preferred vs. preferred hand: the preferred hand is typically organized relative to a dynamic frame of reference provided by the non-preferred hand.
- Benefits: lessened cognitive load (Leganchuk 1998), increased performance (Buxton & Myers 1986), and additional kinesthetic feedback.

:: Toolglasses
The toolglass metaphor (Bier et al. 1993):
- A semi-transparent menu.
- Positioned over a target using the non-preferred hand.
- The preferred hand clicks "through" the menu to apply an operation to the target.

:: Toolglasses
Toolglasses were chosen and designed as pie menus with handles:
- The handle provides the affordance that one can "grab" them.
- The pie layout provides the affordance of crosshair-like targeting of objects behind the toolglass.
[Image: the toolglass proposed by Bier et al. alongside our concept.]
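Selecting a wedge on such a pie toolglass comes down to an angle and radius test around the toolglass centre. A minimal sketch of that angular targeting, assuming a toolglass described by a centre, an inner and outer radius, and a number of equal wedges (the Toolglass struct and hitWedge function are hypothetical, not the project's code):

```cpp
#include <cmath>

// Hypothetical description of a pie-menu toolglass.
struct Toolglass {
    float cx, cy;        // centre of the pie in screen coordinates
    float innerRadius;   // hole in the middle (crosshair-like targeting area)
    float outerRadius;   // outer edge of the wedges
    int   wedgeCount;    // number of equal-sized wedges
};

// Returns the index of the wedge under the cursor, or -1 if the cursor
// lies outside the wedge ring (inside the hole or beyond the outer edge).
int hitWedge(const Toolglass& tg, float x, float y) {
    const float dx = x - tg.cx;
    const float dy = y - tg.cy;
    const float dist = std::sqrt(dx * dx + dy * dy);
    if (dist < tg.innerRadius || dist > tg.outerRadius)
        return -1;
    // Angle in [0, 2*pi), measured counter-clockwise from the +x axis.
    float angle = std::atan2(dy, dx);
    const float twoPi = 2.0f * 3.14159265f;
    if (angle < 0.0f) angle += twoPi;
    return static_cast<int>(angle / (twoPi / tg.wedgeCount)) % tg.wedgeCount;
}
```

With a test like this, the non-preferred hand can hold the toolglass over an object while the preferred hand's cursor only needs to land in the right angular range.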

:: Design Decisions
Other design decisions:
- Toolglass rack: instead of one toolglass, have many and divide similar tasks among them.
- 3DS models: allow users to import the common .3ds (3D Studio Max) file format.
- High-level modelling: users perform high-level tasks such as placing 3D models and applying textures/colors, as opposed to constructing models and scenes from low-level primitives.
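For reference, .3ds is a binary, little-endian chunk format: every chunk starts with a 2-byte identifier and a 4-byte length that includes the 6-byte header, and container chunks simply nest further chunks. A minimal sketch of walking the top-level chunks (the chunk IDs are the commonly documented ones; walkChunks is illustrative, not the project's loader):

```cpp
#include <cstdint>
#include <cstdio>

// Well-known .3ds container chunk identifiers. Named object blocks (0x4000),
// meshes, and materials live inside the editor chunk.
const uint16_t MAIN_CHUNK   = 0x4D4D;  // file root
const uint16_t EDITOR_CHUNK = 0x3D3D;  // 3D editor data (objects, materials)

// Walk the chunks of a .3ds file and print their IDs and sizes.
// Assumes a little-endian host, matching the on-disk byte order.
void walkChunks(FILE* f, long end) {
    while (ftell(f) < end) {
        uint16_t id = 0;
        uint32_t length = 0;
        if (fread(&id, 2, 1, f) != 1 || fread(&length, 4, 1, f) != 1)
            return;
        long next = ftell(f) - 6 + static_cast<long>(length);
        std::printf("chunk 0x%04X, %u bytes\n", id, length);
        if (id == MAIN_CHUNK || id == EDITOR_CHUNK) {
            walkChunks(f, next);       // container chunks: descend into children
        } else {
            std::fseek(f, next, SEEK_SET); // leaf chunks: skip the payload
        }
    }
}
```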

:: Initial Prototype
- Coded in OpenGL.
- Allowed modifications (coloring, scaling, rotation, translation) of GLUT primitives (sphere, cube, torus, teapot).
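For context, modifying and drawing a GLUT primitive in fixed-function OpenGL takes only a handful of calls. A minimal sketch, assuming a hypothetical Model record that holds the colour, scale, rotation, and translation being edited:

```cpp
#include <GL/glut.h>

// Hypothetical per-model state the prototype might track.
struct Model {
    float pos[3];                 // translation
    float angleDeg, axis[3];      // rotation
    float scale;                  // uniform scale
    float color[3];               // colouring
};

// Apply the model's modifications and draw a GLUT primitive.
void drawModel(const Model& m) {
    glPushMatrix();
    glColor3fv(m.color);
    glTranslatef(m.pos[0], m.pos[1], m.pos[2]);
    glRotatef(m.angleDeg, m.axis[0], m.axis[1], m.axis[2]);
    glScalef(m.scale, m.scale, m.scale);
    glutSolidTeapot(1.0);         // could equally be glutSolidSphere/Cube/Torus
    glPopMatrix();
}
```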

:: Prototype Evaluation
Suggested changes:
- Add visual feedback to show which model is currently targeted.
- Add yes/no confirmation for irreversible actions (such as the reset command).
- Add undo/redo.
- Implement collision detection so models and the user cannot pass through the walls of the world.
- Instead of translating models with toolglasses, allow the user to simply 'grab' the model and move it.
- Differentiate 'system' toolglasses from 'model' toolglasses.
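Undo/redo for an editor like this is commonly built on two stacks of reversible commands, where performing a new edit clears the redo stack. A minimal sketch of that pattern (the Command and History names are hypothetical, not the project's implementation):

```cpp
#include <memory>
#include <vector>

// A reversible edit to the scene (e.g. translate, recolor, or scale a model).
struct Command {
    virtual ~Command() = default;
    virtual void apply()  = 0;
    virtual void revert() = 0;
};

class History {
public:
    void perform(std::unique_ptr<Command> cmd) {
        cmd->apply();
        undoStack_.push_back(std::move(cmd));
        redoStack_.clear();                       // a new edit invalidates the redo chain
    }
    void undo() {
        if (undoStack_.empty()) return;
        undoStack_.back()->revert();
        redoStack_.push_back(std::move(undoStack_.back()));
        undoStack_.pop_back();
    }
    void redo() {
        if (redoStack_.empty()) return;
        redoStack_.back()->apply();
        undoStack_.push_back(std::move(redoStack_.back()));
        redoStack_.pop_back();
    }
private:
    std::vector<std::unique_ptr<Command>> undoStack_, redoStack_;
};
```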

:: Alpha System
Many of the above changes were implemented.

:: Alpha System
- Different toolglass handles, distinguishing modification toolglasses from system toolglasses (shown in the slide images).
- Included support for importing .3ds (3D Studio Max) models, although texture support was buggy and disabled for the release.

:: Alpha System
Also:
- Added XML config files to configure toolglasses and options without recompiling; this adds flexibility to the design.
- Added collision detection with walls.
- Added undo/redo, file saving, and sub-toolglasses (shown in the slide images).
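The slides do not show the actual configuration schema, but an XML toolglass description along these lines would let wedges and options change without recompiling (every element and attribute name below is hypothetical):

```xml
<!-- Hypothetical toolglass configuration; the real schema is not shown in the slides. -->
<toolglassRack>
  <toolglass name="Color" type="model">
    <wedge label="Red"   action="setColor" value="1 0 0"/>
    <wedge label="Green" action="setColor" value="0 1 0"/>
    <wedge label="Blue"  action="setColor" value="0 0 1"/>
  </toolglass>
  <toolglass name="File" type="system">
    <wedge label="Save"  action="saveScene"/>
    <wedge label="Load"  action="loadScene"/>
    <wedge label="Reset" action="resetScene" confirm="yes"/>
  </toolglass>
</toolglassRack>
```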

:: Beta System
Deployed in the SRE immersive environment, with gesture tracking.

:: Beta System
Added support for gesture tracking. Toolglasses allowed keeping the gesture set to a minimum:
- Pointing: holding an arm partially extended moves the virtual cursors.
- Selection: fully extending the non-preferred hand selects a toolglass.
- Selection: fully extending the preferred hand invokes a wedge.
- Deselection: arms at the side of the body drops the toolglass.
- Navigation: offsetting one's body from the center of the environment moves the camera in that direction.
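One way to read this gesture set: each arm's extension ratio (hand-to-shoulder distance normalized by arm length) chooses between rest, pointing, and selection, while the body's offset from the room centre drives navigation. A minimal sketch of that mapping; the thresholds and data structures are hypothetical tuning values, not the project's:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

enum ArmState { REST, POINTING, SELECTING };

// Classify one arm from tracked shoulder and hand positions.
// armLength and the 0.5 / 0.9 thresholds are hypothetical tuning values.
ArmState classifyArm(const Vec3& shoulder, const Vec3& hand, float armLength) {
    float extension = dist(shoulder, hand) / armLength;  // 0 = at side, 1 = fully extended
    if (extension > 0.9f) return SELECTING;   // fully extended: select toolglass / invoke wedge
    if (extension > 0.5f) return POINTING;    // partially extended: move the virtual cursor
    return REST;                              // arm at side: drop the toolglass
}

// Navigation: offset of the body from the room centre moves the camera that way.
void updateCamera(const Vec3& body, const Vec3& roomCentre, float speed, Vec3& camPos) {
    camPos.x += (body.x - roomCentre.x) * speed;
    camPos.z += (body.z - roomCentre.z) * speed;
}
```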

:: Beta System
- Improved highlighting of targeted models (shown in the slide image).
- Ability to apply different textures and colors to sub-components of models (shown in the slide image).
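A common fixed-function OpenGL way to make the targeted model stand out is to redraw it as an unlit wireframe overlay on top of the shaded pass. A minimal sketch of that generic technique (not necessarily how the project implemented its highlighting):

```cpp
#include <GL/glut.h>

// Draw the model once normally, then re-draw the targeted one as an unlit
// wireframe overlay so the user can see which model is currently targeted.
void drawWithHighlight(int id, int targetedId, void (*drawModel)(int)) {
    drawModel(id);                           // normal shaded pass
    if (id != targetedId) return;

    glPushAttrib(GL_ENABLE_BIT | GL_POLYGON_BIT | GL_CURRENT_BIT | GL_LINE_BIT);
    glDisable(GL_LIGHTING);                  // flat highlight color
    glColor3f(1.0f, 1.0f, 0.0f);             // yellow outline
    glLineWidth(2.0f);
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    glEnable(GL_POLYGON_OFFSET_LINE);        // pull the lines toward the camera
    glPolygonOffset(-1.0f, -1.0f);           // to avoid z-fighting with the fill pass
    drawModel(id);                           // wireframe pass on top
    glPopAttrib();
}
```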

:: Beta System
- Improved clarity and visibility of the toolglasses in the rack (shown in the slide image).
- Confirmation prompts before irreversible actions (shown in the slide image).

:: Future Work
- Add support for placing pictograms/textures on toolglass wedges instead of text.
- Improve gesture tracking:
  - Employ a robust hand-tracking algorithm rather than the 'furthest point from center' heuristic.
  - Use statistical tracking methods (e.g., the Condensation algorithm) to make tracking less susceptible to noise.
- Run experiments to test the benefits of the interface and interaction techniques.
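For reference, the 'furthest point from center' heuristic mentioned above scans the tracked silhouette points and takes the one farthest from the body centroid as the hand; a single noisy outlier can hijack the result, which is what motivates the statistical methods. A minimal sketch of the heuristic, with hypothetical data structures:

```cpp
#include <cmath>
#include <vector>

struct Point2 { float x, y; };

// 'Furthest point from center' hand heuristic: given the silhouette points of
// a tracked person and the body centroid, assume the hand is the point lying
// farthest from the centroid. One noisy outlier is enough to misplace the hand,
// which is why a statistical tracker (e.g., Condensation) is more robust.
Point2 estimateHand(const std::vector<Point2>& silhouette, const Point2& centroid) {
    Point2 best = centroid;
    float bestDist2 = -1.0f;
    for (const Point2& p : silhouette) {
        float dx = p.x - centroid.x;
        float dy = p.y - centroid.y;
        float d2 = dx * dx + dy * dy;
        if (d2 > bestDist2) { bestDist2 = d2; best = p; }
    }
    return best;
}
```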

:: HCI Lessons
- Visibility of system state and feedback was very important for users: i.e., knowing which model they were targeting, which toolglass wedge was selected, etc.
- User feedback early in the process saved a lot of time in the long term. E.g., an initial user was concerned about being able to click on a wedge; we now use angular targeting, which was simpler to code and thus saved time.
- A major challenge for gesture-based interfaces is tracking and recognizing the gestures. We feel our design is robust, yet poor gesture tracking currently makes it almost unusable.
- It is important to prototype the physical aspects of a system. It is quite physically demanding to keep one's arms outstretched for a long time; our prototype used a mouse and keyboard, so users never tried performing our gesture set for an extended period of time.