Presentation on theme: "UCL Human Representation in Immersive Space" — Presentation transcript:

1 UCL Human Representation in Immersive Space

2 UCL Human Representation in Immersive Space: Body Chat, Sensing, Real-Time Animation

3 BODY CHAT
Aim: to optimise human behaviour in virtual communication.
Environment: networked chat environments.

4 BODY CHAT: Description of the system

5 BODY CHAT: Description of the system (diagram: CLIENT A and CLIENT B)

6 BODY CHAT: Avatar behaviour
Presence and movement: an avatar is created dynamically when the user logs on and removed when the user logs off. Avatars identify a particular user's presence in the virtual environment and pinpoint his or her location. The avatar and its shadow avatar react directly to keyboard input for forward, backward, left and right.
Signs of life: automated breathing and eye blinking of the avatar, with some randomness to prevent synchrony.
Communication: typed text; conversational phenomena; communicative behaviours.
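As an illustration of the "signs of life" idea, the following minimal Python sketch schedules breathing and blinking with randomised timing so that avatars do not stay in lockstep. The class name, timing constants and action strings are assumptions made for this example, not part of the BodyChat system.

    import random

    class SignsOfLife:
        """Schedules idle animations (breathing, blinking) for one avatar.

        Each avatar draws its own randomised intervals, so avatars that log on
        at the same moment quickly drift out of step with one another.
        """

        def __init__(self, blink_mean=4.0, breath_period=3.5):
            self.blink_mean = blink_mean        # average seconds between blinks (assumed value)
            self.breath_period = breath_period  # seconds per breath cycle (assumed value)
            self.next_blink = random.expovariate(1.0 / blink_mean)
            self.breath_phase = random.uniform(0.0, breath_period)  # random starting phase

        def update(self, dt):
            """Advance the timers by dt seconds and return the actions to trigger."""
            actions = []
            self.next_blink -= dt
            if self.next_blink <= 0.0:
                actions.append("blink")
                self.next_blink = random.expovariate(1.0 / self.blink_mean)
            self.breath_phase = (self.breath_phase + dt) % self.breath_period
            if self.breath_phase < dt:          # a new breath cycle has just begun
                actions.append("inhale")
            return actions

    # Two avatars driven by the same clock still blink at different times.
    a, b = SignsOfLife(), SignsOfLife()
    for frame in range(300):
        a.update(1.0 / 30.0)
        b.update(1.0 / 30.0)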

7 BODY CHAT: Avatar behaviour
Communication: conversational phenomena are realised as communicative behaviours.
The salutation phenomenon is associated with: looking, head tossing, waving, smiling.
High-level user control.
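As a rough sketch of high-level user control, a conversational phenomenon such as a salutation can be expanded by the system into the low-level behaviours listed above. The dictionary and function below are an assumed organisation for such a mapping, not the actual BodyChat implementation.

    # Map high-level conversational phenomena to the communicative behaviours
    # that animate them; the user (or the typed text) selects the phenomenon,
    # and the system plays the individual behaviours.
    PHENOMENON_TO_BEHAVIOURS = {
        "salutation": ["look_at_partner", "head_toss", "wave", "smile"],
        "farewell": ["look_at_partner", "wave"],   # extra entry, assumed for illustration
    }

    def behaviours_for(phenomenon):
        """Return the low-level behaviours that realise a conversational phenomenon."""
        return PHENOMENON_TO_BEHAVIOURS.get(phenomenon, [])

    print(behaviours_for("salutation"))
    # ['look_at_partner', 'head_toss', 'wave', 'smile']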

8 BODY CHAT: Awareness of the environment.

9 BODY CHAT: Sample interaction.

10 BODY CHAT: Example of behaviour initiated through typing.

11 BODY CHAT
Objective: to understand the avatar as an autonomous agent with semi-autonomous animations.
Statements: the performance of human features is important, not the appearance of the features themselves. Macromanage, rather than micromanage, avatar behaviour.

12 Real-Time Control of a Virtual Human Using Minimal Sensors
6 DOF sensor ("Flock of Birds" from Ascension Technology, Inc.), reporting position (X, Y, Z) and orientation (Xr, Yr, Zr).
Their goal is to recreate human postures realistically while minimally encumbering the operator.
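A 6 DOF sample pairs three position values with three orientation values. The data structure below is only an assumed representation of such a reading, not the Flock of Birds interface.

    from dataclasses import dataclass

    @dataclass
    class SixDofSample:
        """One tracker sample: position in metres, orientation as Euler angles in degrees."""
        x: float
        y: float
        z: float
        xr: float
        yr: float
        zr: float

    # e.g. a head-mounted sensor 1.7 m above the transmitter, facing straight ahead
    head = SixDofSample(x=0.0, y=0.0, z=1.7, xr=0.0, yr=0.0, zr=0.0)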

13 Real-Time Control of a Virtual Human Using Minimal Sensors
Placement of four 6 DOF position sensors (S1–S4) to track body position.

14 Real-Time Control of a Virtual Human Using Minimal Sensors
Sensors S1–S3 are tied most closely to the view cone and effector (hand) positions.

15 Real-Time Control of a Virtual Human Using Minimal Sensors
Software processing: an inverse kinematics algorithm infers a human posture representation from the sensor position and orientation data. The software must make assumptions about parameters constraining the representation, to compensate for the absence of information from the un-sensed joints, e.g. elbows, knees and feet. The processing time taken to compute this turned out to be the largest portion of latency in each frame update.
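To make the inverse-kinematics step concrete, here is a minimal two-bone solver for a single arm in a plane, based on the law of cosines. The bone lengths, the 2D simplification and the function name are assumptions chosen for illustration; the paper's whole-body posture solver is considerably more involved.

    import math

    def two_bone_ik(shoulder, wrist, upper_len=0.30, fore_len=0.28):
        """Return (shoulder_angle, elbow_angle) in radians that place the wrist.

        shoulder, wrist: (x, y) positions in the arm's plane.
        upper_len, fore_len: bone lengths in metres (assumed average proportions).
        """
        dx, dy = wrist[0] - shoulder[0], wrist[1] - shoulder[1]
        dist = min(math.hypot(dx, dy), upper_len + fore_len - 1e-6)  # clamp to reachable range

        # Law of cosines gives the interior elbow angle, then the shoulder angle.
        cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
        elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))

        cos_inner = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
        shoulder_angle = math.atan2(dy, dx) - math.acos(max(-1.0, min(1.0, cos_inner)))
        return shoulder_angle, elbow

    # Example: the hand sensor reports a wrist 0.4 m ahead of and 0.2 m above the shoulder.
    print(two_bone_ik((0.0, 0.0), (0.4, 0.2)))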

16 Real-Time Control of a Virtual Human Using Minimal Sensors
Applications and limitations: there is no finger/grip sensing for manipulating objects, so non-intuitive protocols are needed, such as bringing the hands together at an intersection to represent "object selection". Calibration is required to suit the user; it was based on average proportions and might misrepresent non-average persons. Range of approximately 3.0 m in a hemisphere around the transmitter.

17 Real-Time Animation of Realistic Virtual Humans

18 Real-Time Animation of Realistic Virtual Humans
ENVIRONMENT: games, interactive TV, CAVE, immersive environments.
CLAIM: simulating humans in real time adds to the sense of presence.
OBJECTIVE: to develop a system capable of animating humans in real time, with high-realism graphics and movement.
CyberTennis: the virtual tennis players, animated by real-time motion capture.

19 APPROACH: a three-part process: modelling of the figure, deformation of the figure in real time, and motion control.
MODELLING: the body is divided into head, hands and body, as each part has different requirements for modelling. The Sculptor software is used to model the head and hands based on prototype examples; the head is created from a template in Sculptor.

20 The BodyBuilder software is used to model the body, taking a multi-layered approach to body design:
1) Articulated skeleton: body proportions are designed at this stage.
2) Joints: metaballs, ellipsoids or grouped volume primitives simulate muscle shape and behaviour; they are attached to the skeleton and can be transformed interactively (a small metaball sketch follows below).
3) Body envelope: equivalent to human skin; uses spline surfaces.
4) Texture fitting or mapping: adds detail to the model, e.g. skin or hair; requires correlation between the model and the image. New software was developed to do this.
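The joint/muscle layer can be pictured as an implicit field summed over metaball primitives, with the body envelope fitted just outside an isosurface of that field. The fall-off function and threshold below are common textbook choices and are assumptions for this sketch, not the specific BodyBuilder formulation.

    def metaball_field(point, balls):
        """Sum of soft inverse-square contributions from each metaball (centre, radius)."""
        total = 0.0
        for (cx, cy, cz), radius in balls:
            d2 = (point[0] - cx) ** 2 + (point[1] - cy) ** 2 + (point[2] - cz) ** 2
            total += radius * radius / (d2 + 1e-9)  # avoid division by zero at the centre
        return total

    # Two overlapping metaballs standing in for an upper-arm muscle.
    muscle = [((0.0, 0.0, 0.00), 0.06),
              ((0.0, 0.0, 0.12), 0.05)]

    # A point lies inside the muscle volume when the field exceeds a threshold;
    # the body envelope (skin) would be fitted just outside this isosurface.
    THRESHOLD = 1.0
    print(metaball_field((0.0, 0.0, 0.06), muscle) > THRESHOLD)   # point between the balls -> True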

21 Animation: animate the skeleton and the upper layers are automatically computed and move with it.
Skeleton: each joint has defined degrees of freedom and rotation; animating the joint angles over time animates the skeleton (see the sketch below).
Skin deformation: a compromise between realism and computing speed. Constructing a body mesh: body data is output as cross-sectional contours, and triangle meshes are produced for each part. This allows manipulation of the skin contours and transforms 3D coordinates into a 2D plane.
Hands: modelled in a similar way.
Facial animation: based on a pseudo-muscle design that treats the skin surface as a polygonal mesh. Basic motion parameters are defined as minimum perceptible actions (MPAs); these MPAs define both the facial expressions and the animated shape of the face. A control lattice controls the face geometry. Animation results from three types of input: video, audio or speech, and pre-defined actions.
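A tiny sketch of "animating the joint angles over time animates the skeleton": keyframed angles are interpolated and pushed through a two-joint forward-kinematics chain. The joint names, keyframes and bone lengths are invented for the example.

    import math

    # Keyframes: time (s) -> joint angles (radians). Two joints of one arm, for brevity.
    KEYFRAMES = {
        0.0: {"shoulder": 0.0, "elbow": 0.0},
        1.0: {"shoulder": math.radians(45), "elbow": math.radians(90)},
    }

    def sample_pose(t):
        """Linearly interpolate the joint angles between the two keyframes."""
        a = max(0.0, min(1.0, t))
        return {j: (1 - a) * KEYFRAMES[0.0][j] + a * KEYFRAMES[1.0][j]
                for j in KEYFRAMES[0.0]}

    def wrist_position(pose, upper_len=0.30, fore_len=0.28):
        """Forward kinematics: chain the two rotations to place the wrist in 2D."""
        s, e = pose["shoulder"], pose["elbow"]
        elbow_x, elbow_y = upper_len * math.cos(s), upper_len * math.sin(s)
        return (elbow_x + fore_len * math.cos(s + e),
                elbow_y + fore_len * math.sin(s + e))

    # Because the upper layers are attached to the skeleton, stepping these
    # angles through time is enough to drive the whole deformable body.
    for t in (0.0, 0.5, 1.0):
        print(t, wrist_position(sample_pose(t)))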

22 Skeleton motion control: possible approaches (a code sketch of the three follows below)
1) Skeleton motion captured in real time drives a pure avatar: movements exactly reflect those of the real person. Information is collected via tracking sensors and used in combination with human-animation knowledge.
2) Skeleton motion is selectively activated from a database of predefined motions: the avatar is controlled by the user in real time, but the avatar's movements do not correspond to those of the user. Motion capture can be used to generate the animated sequences, which reduces design time.
3) Skeleton animation is dynamically calculated: an autonomous actor may act without the user's intervention. Its behaviour relies on perception of the environment; it should use visual, auditory and tactile senses and adapt its behaviour according to the information received.
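The three approaches can be thought of as interchangeable motion sources that all feed the same skeleton. The class names and the trivial decision rule in the autonomous case are illustrative assumptions, not the paper's design.

    class MotionCaptureSource:
        """Approach 1: joint angles come straight from the tracking sensors (pure avatar)."""
        def __init__(self, read_sensors):
            self.read_sensors = read_sensors
        def joint_angles(self, t):
            return self.read_sensors()

    class MotionDatabaseSource:
        """Approach 2: the user triggers predefined clips; motion does not mirror the user."""
        def __init__(self, clips):
            self.clips, self.current = clips, "idle"
        def trigger(self, name):
            self.current = name
        def joint_angles(self, t):
            clip = self.clips[self.current]
            return clip[int(t * 30) % len(clip)]   # loop the clip at 30 frames per second

    class AutonomousSource:
        """Approach 3: angles are computed from the actor's perception of its environment."""
        def __init__(self, perceive):
            self.perceive = perceive
        def joint_angles(self, t):
            # placeholder decision rule: wave if another avatar is visible, else stand still
            return {"shoulder": 1.2, "elbow": 0.5} if self.perceive() else {"shoulder": 0.0, "elbow": 0.0}

    # All three expose joint_angles(t), so the animation layers (skin deformation,
    # rendering) need not know which control strategy is in use.
    db = MotionDatabaseSource({"idle": [{"shoulder": 0.0, "elbow": 0.0}],
                               "wave": [{"shoulder": 1.2, "elbow": 0.4},
                                        {"shoulder": 1.2, "elbow": 0.9}]})
    db.trigger("wave")
    print(db.joint_angles(0.5))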

23 Body Chat, Sensing, Real-Time Animation
How realistic does the graphical representation of the virtual embodiment have to be for a given use context? To what extent does the virtual embodiment have to be an accurate mapping of the real, as opposed to a synthetic caricature, for a given use?

