Immersion, Presence, Distributed VR. Bob Hobbs, Staffordshire University Computing School.


Outline
– Context
– Immersion
– Presence
– Shared Environments

Virtual Reality is a Tool
What it is:
– Use of highly interactive real-time immersive systems to convey information
What it is not:
– Desktop graphics
– Text based
– Non-interactive
– Linear

Immersion: Realisation of an Environment
An immersive system:
– generates displays ideally in all sensory systems;
– fully encloses the participant in those displays;
– tracks the body, limbs and head;
– determines the optical, auditory... arrays as a function of head tracking.
Either:
– it displays a Virtual Body whose movements are a function of the tracking (mainly with an HMD), or
– the participant can visualise self and world directly (CAVE).

Virtual Body At any moment there is a position in the geometry with respect to which sensory data is generated - the egocentric self-reference position. This corresponds to the place occupied by the human actor in the environment. At the self-reference position there is a functioning VB represented by the displays.

CAVE

Position Tracking Systems
Polhemus Inc.:
– 3Space ISOTRAK (1 sensor)
– 3Space FASTRAK (many sensors)
Ascension Technology Corp.:
– Flock of Birds
– pcBIRD
– SpacePad

Tracker Calibration
Dynamic errors
– caused by external electromagnetic fields
– can be corrected by increasing the measurement frequency, synchronising the measurements with the external field source, and filtering
Static errors
– caused by field distortions due to the surrounding metal and external fields
– can be corrected via tracker calibration

Calibration Table (figure: true vs. tracked positions on the X/Z grid)

Calibration Example
– CAVE, Flock of Birds 4 feet from the floor
– 1-foot grid
– 4th-order polynomial fit

Interpolation (figure: true space vs. tracked space)
V. Kindratenko, A. Bennett, "Evaluation of Rotation Correction Techniques for Electromagnetic Position Tracking Systems", in Proc. VE 2000.

Data Acquisition Techniques
The size and type of a calibration table depend on:
– the type of calibration technique to be used
– the severity of the field distortions
– the required calibration quality
The calibration table can be:
– irregular (for a high-order polynomial fit)
– regular in the true space (for interpolation)
– regular in the tracked space (for tri-linear interpolation)
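As a hedged illustration of table-based correction, here is a 1-D analogue of the interpolation scheme above (a real tracker calibration interpolates tri-linearly over a 3-D grid of sensor readings; the table values here are invented):

```python
def correct(reading, table):
    """Correct a 1-D tracker reading by linear interpolation in a
    calibration table of (tracked, true) pairs, sorted by tracked value."""
    for (t0, true0), (t1, true1) in zip(table, table[1:]):
        if t0 <= reading <= t1:
            f = (reading - t0) / (t1 - t0)   # position within the cell
            return true0 + f * (true1 - true0)
    raise ValueError("reading outside the calibrated range")

# Hypothetical table: the field distortion stretches readings away from the sensor
table = [(0.0, 0.0), (1.1, 1.0), (2.4, 2.0), (3.9, 3.0)]
print(correct(1.75, table))  # midway between grid cells 1.1 and 2.4 -> 1.5
```

The same lookup, applied per axis over a regular grid in the tracked space, is exactly why a table "regular in the tracked space" suits tri-linear interpolation: cell lookup becomes a constant-time index computation.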

Regular Grid in the True Space

An Immersive Participant
A user will:
– be head tracked
– have a 'wand'
– wear stereo glasses in a CAVE
An HMD user may have additional tracking sensors, e.g. a data glove or motion tracker.

Data Glove
Hand measurement devices must sense both the flexing angles of the fingers and the position/orientation of the wrist in real time. A typical example is the DataGlove from VPL Research: a lightweight nylon glove with optical sensors mounted along the fingers.

Each sensor is a short length of fibre-optic cable with a light-emitting diode (LED) at one end and a phototransistor at the other. When the cable is flexed, some of the LED's light is lost, so less light is received by the phototransistor. A 3Space ISOTRAK sensor attached to the back of the glove measures the orientation and position of the gloved hand.

Data Suit
Much less popular than the DataGlove, a data suit measures the positions of the whole body. A typical example of its use is the Fuji TV film The Dream of Mr. M, in which a 3D character approximately performs the same motion as the animator. Another way of measuring body positions is simply to use a collection of sensors such as the Flock of Birds; however, this needs algorithms for calibration and conversion (see the paper by Molet et al.).

Sound
MIDI equipment and workstation audio for sound generation and effects; filter processors and 3D-audio cards for spatial audio. Two categories of sound in VR can be identified:
– Simulation of real-world acoustics: based on our everyday experience, the physical behaviour of sound can be modelled. This comprises sound generation (e.g. caused by object collision), sound propagation, and auralisation. Immersive user interfaces can be used to evaluate simulation results.
– Sound at the user interface: sound can be applied to support the user in the current task or to provide information about invisible proceedings.

Presence
Presence is a state of consciousness in which the human actor has a sense of being in the location specified by the displays. We take presence as the central feature of "virtual reality": "A virtual reality is defined as a real or simulated environment in which a perceiver experiences telepresence" (Steuer). The unique feature of virtual reality systems is that they are general-purpose presence-transforming machines.

Meaning of Presence
Presence is the psychological sense of being there in the environment specified by the displays. A high degree of presence in the VE should lead to the participant experiencing objects and processes in the virtual world as (temporarily) more the presenting reality than the real world in which the VE experience is actually embedded. A corollary is that the participant should exhibit the same behaviours as they would in similar circumstances in everyday reality. The VE experience should be more like visiting a place than seeing images designating a place.

Design in Immersive VEs
With design in immersive virtual environments:
– the designer shares the same space as the objects;
– a degree of evaluation can take place in the virtual space;
– presence leads to the designer behaving in a manner appropriate to everyday reality in similar circumstances;
– special "interactive techniques" and behaviours do not have to be learned.

Feedback
Two forms of feedback:
– Force feedback: manipulating virtual objects; gravity simulation
– Touch (tactile) feedback: texture appreciation; navigation; sensitive use
Both use haptic devices.

What is a haptic interface? A haptic interface is a force-reflecting device which allows a user to touch, feel, manipulate, create, and/or alter simulated 3D objects in a virtual environment. Movement trackers alone do not provide feedback.

Tactile Feedback

Usage
Haptic interfaces could be used to train physical skills such as those jobs requiring specialised hand-held tools (e.g. surgeons, astronauts, mechanics), to provide haptic-feedback modelling of three-dimensional objects without a physical medium (such as automobile body designers working with clay models), or to mock up developmental prototypes directly from CAD databases (rather than in a machine shop).

Phantom: a very common haptic device, mainly used for augmentation on desktop systems.

Exoskeleton

Actuators
Electrical current drives the actuators controlling individual joints:
– directly, to motors or solenoids
– to valves controlling the flow of fluids in hydraulic or pneumatic systems

Presence in Multi-participant Environments
– Sense of being in a place
– Sense of sharing the same space as other individuals
– Sense of belonging to a totality that is more than just the sum of the individuals
Awareness may be an important factor enhancing shared presence; shared presence may correspondingly enhance awareness.

Robot Arm
– The simplest sort of robot
– A typical arm has 7 segments and 6 joints (6 DOF); the human arm has 7 DOF
– Usually driven by stepper motors
– Main use is in manufacturing

Robot Arm
– Fitted with an end effector, usually interchangeable: artificial hand, paint gun, welding rod
– A pressure sensor is needed to prevent crushing
– Programmed by incremental steps which are then replicated ad infinitum

Frameworks, Chains (or Skeletons)
Many mechanical objects in the real world consist of solid sections connected by joints. Obviously the robot arm, but also:
– creatures such as humans and animals
– car suspension
– ropes, string and chains

Frameworks, Chains (or Skeletons)
The sections and joints of a robot arm are known as a 'chain'; in creatures this could be referred to as a skeleton. Moveable sections correspond to bones, and the attachments between bones are joints.

Frameworks, Chains (or Skeletons)
Motions of chains can be specified in terms of translations and rotations.
– Forward kinematics: from the amounts of rotation and bending of each joint in an arm, for example, the position of the hand can be calculated.
– Inverse kinematics: if the hand is moved, the rotation and bending of the arm is calculated, in accordance with the length and joint properties of each section of the arm.
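Forward kinematics can be illustrated with a minimal planar sketch: the hand position is the sum of the link vectors. The two-link example and the relative-angle convention are assumptions of mine, not from the lecture:

```python
import math

def forward_kinematics(lengths, angles):
    """End-effector position of a planar chain rooted at the origin.
    Each joint angle is measured relative to the previous link."""
    x = y = 0.0
    total = 0.0  # accumulated absolute orientation of the current link
    for length, theta in zip(lengths, angles):
        total += theta
        x += length * math.cos(total)
        y += length * math.sin(total)
    return x, y

# Two links of length 1: shoulder at 90 degrees, elbow bent back 90 degrees
print(forward_kinematics([1.0, 1.0], [math.pi / 2, -math.pi / 2]))  # ~(1.0, 1.0)
```

Inverse kinematics inverts this map: given the desired (x, y), solve for the joint angles, which in general has zero, one, or many solutions.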

Joint Translation-Rotation
We can use a transform T to map each point defined relative to the body to a position in world coordinates. If we want to model both linear and angular (rotational) motion, then we need a 4x4 homogeneous matrix to represent the transform.
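A minimal sketch of such a 4x4 transform, assuming rotation about the Z axis only (the general case handles arbitrary axes; the numbers below are illustrative):

```python
import math

def make_transform(theta, tx, ty, tz):
    """4x4 homogeneous transform: rotation by theta about Z, then translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [
        [c,  -s,  0.0, tx],
        [s,   c,  0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply(T, p):
    """Map a body-space point (x, y, z) into world coordinates."""
    v = (p[0], p[1], p[2], 1.0)  # homogeneous coordinates: w = 1 for points
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# A point one unit along the body's X axis; body rotated 90 degrees and moved to (5, 0, 0)
T = make_transform(math.pi / 2, 5.0, 0.0, 0.0)
print(apply(T, (1.0, 0.0, 0.0)))  # ~(5.0, 1.0, 0.0)
```

Chained joints compose by matrix multiplication, which is why the 4x4 form is convenient: one matrix per link, multiplied down the chain.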

What is Inverse Kinematics? (figure: forward kinematics, from base to end effector)

What is Inverse Kinematics? (figure: inverse kinematics, from end effector back to base)

Kinematic Chains
– Solid links connected at movable joints
– Fixed end: base; movable end: tip or end effector
– One degree of freedom (DOF) per joint
– Open chain: one fixed end, one movable end
– Closed chain: both ends fixed

Forward and Inverse Kinematics

Kinematic Redundancy
The end effector has 6 DOFs:
– (x, y, z) position
– (θ, φ, ψ) orientation
A non-redundant linkage has ≤ 6 joints (DOFs); a redundant linkage has > 6 joints (DOFs).
– The human arm has 7 DOFs: shoulder 3, elbow 1, forearm 1, wrist 2
– Redundancy enables multiple solutions

Inverse Kinematics (IK)
– Non-redundant linkages: analytical solutions
– Redundant linkages: many techniques, e.g. pseudo-inverse (Jacobian), gradient, others
– IK is commonly found in animation packages, e.g. 3D Studio Max

Redundancy
A redundant system has an infinite number of solutions. The human skeleton has about 70 DOF (ultra-super redundant). How do we solve such a highly redundant system?

Iterative Solution
Start at the end effector and move each joint so that the end gets closer to the target. The angle of rotation for each joint is found by taking the dot product of the (normalised) vectors from the joint to the current end point and from the joint to the desired end point, then taking the arccos of this dot product. To find the sign of the angle (i.e. which direction to turn), take the cross product of these vectors and check the sign of its Z component.
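The per-joint rotation described above can be sketched as cyclic coordinate descent (CCD) for a planar chain. Here the signed angle is computed with atan2, which is equivalent to the dot/cross-product construction but avoids explicit normalisation; the link lengths, starting angles, and target are illustrative choices of mine:

```python
import math

def positions(lengths, angles):
    """All joint positions of a planar chain rooted at the origin (relative angles)."""
    pts = [(0.0, 0.0)]
    total = x = y = 0.0
    for length, a in zip(lengths, angles):
        total += a
        x += length * math.cos(total)
        y += length * math.sin(total)
        pts.append((x, y))
    return pts

def ccd_solve(lengths, angles, target, iterations=50):
    """Repeatedly rotate each joint (from the end effector back to the base)
    by the signed angle between the joint-to-effector and joint-to-target
    vectors, so the end effector closes in on the target."""
    angles = list(angles)
    tx, ty = target
    for _ in range(iterations):
        for j in reversed(range(len(angles))):
            pts = positions(lengths, angles)
            jx, jy = pts[j]        # rotating joint
            ex, ey = pts[-1]       # current end effector
            a_eff = math.atan2(ey - jy, ex - jx)
            a_tgt = math.atan2(ty - jy, tx - jx)
            angles[j] += a_tgt - a_eff  # rotate subtree toward the target
    return angles

lengths = [1.0, 1.0]
sol = ccd_solve(lengths, [0.3, 0.3], (1.0, 1.0))
print(positions(lengths, sol)[-1])  # close to (1.0, 1.0)
```

Because angles are relative, adjusting joint j rotates the whole sub-chain beyond it about joint j's position, which is exactly the traversal the iterative method requires.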

Goal Potential Function
The "distance" from the end effector to the goal, as a function of the joint angles θ: G(θ)

Our Example (figure: base, end effector, and the goal distance)

Quiz
Will G(θ) always be zero?
– No: unreachable workspace
Will a solution always be found?
– No: local minima / singular configurations
Will the solution always be unique?
– No: redundancy

Conflict Between Goals (figure sequence: two end effectors, ee1 and ee2, attached to one base; Goal 1 and Goal 2 pull the shared chain in different directions)

Figure Modeling
Many VE applications require human, animal or robotic actors:
– Team training exercises (SIMNET, DIS)
– Mission planning and rehearsal
– Human factors studies (Boeing walkthroughs)
Virtual actors are computational models of their real-world counterparts.

Virtual Actors: Autonomous or Guided
Guided actors are slaved to the motions of a human participant using body tracking (optical, mechanical, ...); also known as avatars.
Autonomous actors are controlled by behaviour modelling programs, and can:
– augment or replace human participants
– serve as surrogate instructors
– act as guides in complex synthetic worlds
Hybrid control is desirable: VRLOCO uses interaction to invoke and control locomotion behaviours.

The Weiss 6-Level Motor Organization Hierarchy
Organism level: 6. Motor Behavior; 5. Motor Organ System; 4. Motor Organ; 3. Muscle Group; 2. Muscle; 1. Motor Unit (neuron level)
3. Muscle Group: coordinated action of several muscles; motion at one joint
2. Muscle: muscle contraction
1. Motor Unit: neuron + muscle fibres; twitching, shivering

The Weiss 6-Level Motor Organization Hierarchy (continued)
6. Motor Behavior: movement of the whole organism, e.g. goal-directed locomotion; task manager
5. Motor Organ System: coordinated action of several limbs, e.g. walking; motor programs, skills
4. Motor Organ: coordinated action of several joints, e.g. the stepping motion of a limb; local motor programs

Motion and Reaction
Sensorimotor level (levels 1-5):
– Peripheral and proprioceptive feedback associated with reflex arcs
– Motor programs and reflexes coordinate and control motion
– Executes behaviours
Reactive level (level 6 and higher):
– Perception triggers and modulates behaviour
– The organism responds to environmental stimuli to select and compose behaviours
– Selects behaviours

Organization of a Virtual Actor
Level 6 and above: reactive level. Levels 1-5: sensorimotor level.

Virtual Actor

Abstraction and Interaction

Representation and Abstraction

Finite State Machines for Walking

Control and Abstraction

Avatars

Static Balance

Weight
Bend:
– Non-weight-bearing motion
– Traverse the subtree rooted at the rotating joint
Pivot:
– Weight-bearing motion
– Traverse the entire tree starting at the root EXCEPT for the subtree rooted at the rotating joint
A critical element of realism: is the character supported by its legs, or are the legs dangling in space as the character is translated along?

Bend: non-weight-bearing motion; traverse the subtree rooted at the rotating joint.

Pivot: weight-bearing motion; traverse the entire tree starting at the root EXCEPT for the subtree rooted at the rotating joint.

Gait Parameters
– Gait pattern: sequence of lifting and placing the feet
– Gait cycle: one repetition of the gait pattern
– Period: duration of one gait cycle
– Relative phase of leg i: fraction of the gait cycle before leg i is lifted
– Duty factor: fraction of the gait cycle period a given leg spends on the ground
– Swing time: time a leg spends in the air
– Stance time: time a leg spends on the ground
– Stroke: distance the body travels during a leg's stance time
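The timing parameters above are related by simple arithmetic; a small helper makes the relations concrete (the numeric values and the helper itself are illustrative, not from the lecture):

```python
def gait_summary(period, stance_time, speed):
    """Derive swing time, duty factor, and stroke from the gait definitions.
    Times in seconds, speed in metres per second."""
    swing_time = period - stance_time    # time the leg is in the air
    duty_factor = stance_time / period   # fraction of the cycle on the ground
    stroke = speed * stance_time         # distance the body travels during stance
    return swing_time, duty_factor, stroke

# A walking gait: 1.2 s cycle, each leg on the ground for 0.72 s, body moving at 1.5 m/s
print(gait_summary(1.2, 0.72, 1.5))  # swing ~0.48 s, duty factor ~0.6, stroke ~1.08 m
```

A duty factor above 0.5 per leg is characteristic of walking (some foot always on the ground); running drops it below 0.5.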

Finite State Machines for Walking
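A walking controller of this kind can be sketched as a table-driven state machine for one leg. The states and transitions below are a minimal choice of my own; the lecture's actual FSM is in the figure:

```python
# Per-leg gait states, cycled in a fixed order.
TRANSITIONS = {
    "stance": "lift",   # leg on the ground, about to lift
    "lift": "swing",    # leg leaves the ground
    "swing": "place",   # leg travels through the air
    "place": "stance",  # foot touches down again
}

def step(state, steps=1):
    """Advance the leg's state machine by a number of transitions."""
    for _ in range(steps):
        state = TRANSITIONS[state]
    return state

print(step("stance", 4))  # four transitions complete one gait cycle: "stance"
```

A multi-legged gait is then a set of these machines run with phase offsets (the relative phase of each leg, as defined in the gait parameters).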

Tele-Immersion
The goal is not just making these collaborations possible, but making them convenient.

CAVERNsoft Application: Virtual Harlem
Bryan Carter, Bill Plummer, ATC (Advanced Technology Center at the Univ. of Missouri-Columbia), SIGGRAPH 1999. Harlem is reconstructed for an African American Literature course at MU. Instead of just reading literary works from this era, this prototype allows students to become immersed and engaged in an interactive literature course.
Jim Sosnoski, Jim Fletcher, English Dept., Univ. of Illinois at Chicago; Steve Jones, Communications Dept., Univ. of Illinois at Chicago.

Elements of Tele-Immersion

Avatars
– Tracking head and hand position and orientation gives good cues
– Extendable pointing rays can be useful in large spaces
– Exaggerated head and hand motions give better cues than the hand alone

Shared Virtual Environments in Europe
Collaborative Virtual Environments (COVEN), an ACTS project, develops an integrated teleworking platform that supports multi-sensory presence for collaboration in shared virtual environments. Services:
– mechanisms to support the presence of users in shared virtual environments;
– browsing and interaction facilities for large numbers of users accessing enormous quantities of remote information;
– synchronised multi-sensory interaction with dynamic representations of three-dimensional objects and actors;
– support for collaborative tasks requiring complex motor skills and shared information.

VR Applications
– Augmented reality: placing data in the normal workspace
– Data visualisation: explaining data through better representation
– Training: for dangerous/expensive procedures
– Conferencing: social context for telecommunication
– Health: treatment of phobias and psychological disorders
– Entertainment