Creating Virtual Reality Applications Using FreeVR
David J. Zielinski
Friday Visualization Forum, October 1st, 2004

Overview
1) What is Virtual Reality?
2) What are my choices?
3) Why use FreeVR?
4) FreeVR program outline
5) Application demos

1) What is Virtual Reality?

What is Virtual Reality?
● Physically immersive (generates input to the user's sensory systems)
● Interactive (the user can change the experience)
Examples:
– reading a book (not interactive or immersive)
– video game (interactive but not immersive)
– 3-d movie (immersive but not interactive)

What can we use for physically immersive?
● 3-d vision
– projection systems: anaglyphic (red/blue), polarized, active stereo
– head-mounted displays

What can we use for physically immersive?
● 3-d sound
– multichannel speaker setups
– headphone simulation

What can we use for physically immersive?
● Other:
– touch/force feedback
– taste
– smell
– environmental conditions (temperature)
– accelerations/motion
– direct connection to the human nervous system

What can we use for interactive?
● location tracking systems: magnetic, video, ultrasonic

What can we use for interactive?
● joysticks: wands, game controllers
● other: gloves, eye tracking, biological indicators (heart rate, breathing), props (steering wheels, fishing poles)

2) What are my choices for developing an application/experience?
A) Write from scratch
B) Low-level library
C) Medium-level library (scenegraph)
D) High-level graphical program
E) Use a domain-specific, off-the-shelf program

Write from scratch
● Pro:
– complete control
– maximum performance/speed possible
● Con:
– long, difficult development
– often device dependent

Low-level library (CAVELib, FreeVR, DIVERSE)
● Pro:
– lots of control (C/C++, OpenGL programming)
– device independent
● Con:
– difficult for non-programmers
– more control/freedom than most applications require

Medium-level library (OpenSceneGraph, Performer)

● Pro:
– built-in algorithms to cull (eliminate) non-viewable objects
– built-in algorithms for collision detection
● Con:
– still lots of programming
– learning curve to use the scenegraph format
– usually still dependent on a low-level library

High-level graphical program (Virtools)

● Pro:
– easy visual design of worlds
– visual, flow-based design of object behaviors
● Con:
– expensive
– uncertain speed/performance
– still have to learn to "program"

Use a domain-specific, off-the-shelf program (Amira, VMD)

● Pro:
– very easy, especially if the group is already using the non-VR version of the program
● Con:
– individual configuration
– not applicable to making custom applications

3) Why use FreeVR?
● device independence (input and output)
● free/open source (CAVELib is not)
● actively developed (cluster support soon)
● I know how to configure it

How does it become device independent?
● A configuration file (.freevrrc) maps buttons, sensors, and screen locations to FreeVR internals. Different physical configurations require different .rc settings.
● On startup FreeVR opens all necessary windows and handles the projection matrices, so the correct view appears on the correct screen.

abstraction of buttons:

int pressed = vrGet2switchValue(which_button);
if (pressed)
    printf("user pressed the button!");
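
As a concrete sketch of how this gets used (the button index 0, the helper name check_wand_button, and the printed message are illustrative assumptions, not from the slides), an update step often reacts only to the moment the button goes down rather than to it being held:

#include <stdio.h>
/* ...plus the FreeVR header that declares vrGet2switchValue() */

/* Sketch: act once per press by detecting the 0 -> 1 transition. */
void check_wand_button(void)
{
    static int last_pressed = 0;           /* state from the previous frame */
    int pressed = vrGet2switchValue(0);    /* nonzero while the button is held */

    if (pressed && !last_pressed)
        printf("button just pressed!\n");  /* fires once per press */

    last_pressed = pressed;
}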

abstraction of location sensors:

vrPointGetRWFrom6sensor(&wand_locpnt, WAND_SENSOR);
float x, y, z;
x = wand_locpnt.v[0];
y = wand_locpnt.v[1];
z = wand_locpnt.v[2];
if ((x < ...) && (y < ...) && (z < ...))
    /* we are touching an object, do something */
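
A small sketch of the kind of proximity test hinted at above (the helper name wand_is_touching, the object position and radius arguments, and the vrPoint type name are assumptions for illustration; WAND_SENSOR is taken from the slide): compute the distance from the wand to an object and report whether it is close enough.

#include <math.h>
/* ...plus the FreeVR header that declares vrPointGetRWFrom6sensor() */

/* Sketch: is the wand within `radius` of the point (ox, oy, oz)? */
int wand_is_touching(float ox, float oy, float oz, float radius)
{
    vrPoint wand_locpnt;   /* type name assumed from the function name */
    vrPointGetRWFrom6sensor(&wand_locpnt, WAND_SENSOR);

    float dx = wand_locpnt.v[0] - ox;
    float dy = wand_locpnt.v[1] - oy;
    float dz = wand_locpnt.v[2] - oz;

    return sqrtf(dx*dx + dy*dy + dz*dz) < radius;
}

/* usage: if (wand_is_touching(0.0f, 5.0f, -5.0f, 0.2f)) { ...grab it... } */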

abstraction of display space:
We provide a single function that is called every frame, once per render window.

void draw_world()
{
    clear_screen();              /* pseudocode */
    glTranslatef(0, 5, -5);      /* real-world coordinates */
    draw_object();               /* pseudocode */
}
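
As a slightly fuller sketch of such a render callback (the translation values come from the slide; drawing a quad is an illustrative stand-in for draw_object, and clearing both color and depth buffers is an assumption about what clear_screen means):

#include <GL/gl.h>

/* Sketch of a per-window render callback. */
void draw_world(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);   /* "clear_screen" */

    glPushMatrix();
    glTranslatef(0.0f, 5.0f, -5.0f);    /* real-world coordinates */

    glColor3f(1.0f, 0.0f, 0.0f);        /* a 1x1 red square as a placeholder object */
    glBegin(GL_QUADS);
        glVertex3f(-0.5f, -0.5f, 0.0f);
        glVertex3f( 0.5f, -0.5f, 0.0f);
        glVertex3f( 0.5f,  0.5f, 0.0f);
        glVertex3f(-0.5f,  0.5f, 0.0f);
    glEnd();

    glPopMatrix();
}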

4) FreeVR program outline, pt. 1

main()
{
    /* initialization calls */
    /* set up callback for the world render function */
    /* set up callback for graphics initialization */
    while (!terminate) {
        vrFrame();
        update();   /* next slide */
    }
}

FreeVR program outline, pt. 2

update()
{
    /* check buttons */
    /* check sensor positions */
    /* do logic */
}

draw_world()
{
    /* look at world settings (modified in update) */
    /* make OpenGL calls to render the world */
}
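
Read together, the two outline slides correspond to a main() of roughly this shape. This is only a sketch: the initialization and callback-registration steps are left as comments because their exact FreeVR calls are not shown in these slides; only vrFrame() comes from the outline itself.

/* Sketch of the overall program structure described above. */
int main(int argc, char *argv[])
{
    /* ...FreeVR initialization calls... */
    /* ...register draw_world() as the world render callback... */
    /* ...register the one-time graphics initialization callback... */

    int terminate = 0;        /* would be set from user input, e.g. a wand button */
    while (!terminate) {
        vrFrame();            /* let FreeVR process and render one frame */
        update();             /* poll inputs and advance the simulation */
    }

    /* ...FreeVR shutdown... */
    return 0;
}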

FreeVR programming subtleties
● Rendering and updates occur in separate processes, so shared memory is needed. FreeVR includes vrShmemAlloc0, which is comparable to malloc. Typically a world data structure is created in shared memory, modified, and passed around.
● Because of the separate processes, we need to lock the world data when writing and reading it. FreeVR includes vrLock; for example, we can call vrLockReadSet(the_lock); and vrLockReadRelease(the_lock); around a read.
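
A sketch of that pattern, using the calls named above (vrShmemAlloc0, vrLockReadSet, vrLockReadRelease). The WorldData fields are illustrative only; declaring the_lock as a vrLock, and how that lock gets created, are assumptions not covered by this slide, and the corresponding write-side locking calls are likewise not shown here.

/* Sketch: a world structure shared between the update and render processes. */
typedef struct {
    float object_pos[3];        /* illustrative fields, not from the slides */
    int   grabbing;
} WorldData;

WorldData *world = NULL;
vrLock the_lock;                /* assumed declaration; lock creation not shown */

void init_world(void)
{
    /* vrShmemAlloc0 zero-fills the allocation, like a shared-memory calloc */
    world = (WorldData *)vrShmemAlloc0(sizeof(WorldData));
}

/* Reading shared state safely, e.g. from inside draw_world(): */
void read_object_pos(float out[3])
{
    vrLockReadSet(the_lock);            /* hold the lock while reading */
    out[0] = world->object_pos[0];
    out[1] = world->object_pos[1];
    out[2] = world->object_pos[2];
    vrLockReadRelease(the_lock);
}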

5) Applications – PDB viewer

Virtual Vibraphone

FreeVR Download
David J. Zielinski