UCL Human Representation in Immersive Space: Body Chat, Sensing, Real-Time Animation

Presentation transcript:

UCL Human Representation in Immersive Space

Overview: Body Chat | Sensing | Real-Time Animation

BODY CHAT
Aim: To optimise human behaviour in virtual communication.
Environment: Networked chat environments.

BODY CHAT: Description of the system

BODY CHAT: Description of the system (diagram: Client A and Client B)

BODY CHAT: Avatar behaviour
Presence and movement: An avatar is created dynamically when the user logs on and removed when the user logs off. Avatars identify a particular user's presence in the virtual environment and pinpoint his or her location. The avatar and its shadow avatar react directly to keyboard input: forward, backward, left and right.
Signs of life: Automated breathing and eye blinking, with some randomness to prevent synchrony between avatars.
Communication: Typed text; conversational phenomena; communicative behaviours.
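The "randomness to prevent synchrony" point can be sketched as a jittered blink scheduler. This is an illustrative sketch only; the function and its parameters are assumptions, not part of the BodyChat system:

```python
import random

def schedule_blinks(duration_s, mean_interval_s=4.0, jitter_s=1.5, seed=None):
    """Return blink times with random jitter so that several avatars
    sharing a world do not blink in lockstep."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        # Each interval is the mean plus a uniform perturbation.
        t += mean_interval_s + rng.uniform(-jitter_s, jitter_s)
        if t >= duration_s:
            break
        times.append(t)
    return times
```

Two avatars seeded differently will drift out of phase almost immediately, which is the desired effect.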

BODY CHAT: Avatar behaviour
Communication gives rise to conversational phenomena, which are realised as communicative behaviours. The salutation phenomenon, for example, is associated with: looking, head tossing, waving, smiling. The user exercises only high-level control; the system fills in the behaviours.

BODY CHAT: Awareness of environment (illustration)

BODY CHAT: Sample interaction (illustration)

BODY CHAT: Sample of behaviour initiated through typing (illustration)

BODY CHAT
Objective: To understand the avatar as an autonomous agent with semi-autonomous animations.
Statements: The performance of human features is what matters, not the appearance of the features themselves. Macromanage rather than micromanage avatar behaviour.

Real-Time Control of a Virtual Human Using Minimal Sensors
Uses 6-DOF sensors ("Flock of Birds" from Ascension Technology, Inc.), each reporting position (X, Y, Z) and orientation (Xr, Yr, Zr). The goal is to realistically recreate human postures while minimally encumbering the operator.

Real-Time Control of a Virtual Human Using Minimal Sensors
Placement of four 6-DOF position sensors (S1–S4) to track body position (diagram).

Real-Time Control of a Virtual Human Using Minimal Sensors
Sensors S1–S3 are tied most closely to the view cone and the effector (hand) positions (diagram).

Real-Time Control of a Virtual Human Using Minimal Sensors
Software processing: An inverse kinematics algorithm infers a human posture representation from the sensor position and orientation data. The software must make assumptions about parameters constraining the representation, to overcome the absence of information from all the unsensed joints (e.g. elbows, knees and feet). The processing time taken to compute this posture turned out to be the largest portion of latency in each frame update.
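A minimal sketch of the idea: an analytic two-link planar IK solver infers an elbow angle that is never sensed directly, given only a hand position relative to the shoulder. The function, the link lengths and the elbow-down convention are illustrative assumptions, not the paper's actual full-body algorithm:

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.3):
    """Given a hand position (x, y) relative to the shoulder, return
    (shoulder_angle, elbow_angle) in radians for a planar 2-link arm.
    The elbow-down choice stands in for the constraining assumptions
    the software must make about unsensed joints."""
    d = math.hypot(x, y)
    # Clamp the target into the reachable annulus of the arm.
    d = min(max(d, abs(l1 - l2) + 1e-9), l1 + l2 - 1e-9)
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Running forward kinematics on the returned angles reproduces the hand position, which is how such a solver is typically validated.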

Real-Time Control of a Virtual Human Using Minimal Sensors
Applications and limitations:
There is no finger/grip sensing for manipulating objects, so non-intuitive protocols are needed, such as bringing the hands together to an intersection to represent "object selection".
Calibration is required to suit the user; it was based on average body proportions and might misrepresent non-average persons.
The range is approximately 3.0 m in a hemisphere around the transmitter.

Real-Time Animation of Realistic Virtual Humans

Real-Time Animation of Realistic Virtual Humans
Environment: Games, interactive TV, CAVEs, immersive environments.
Claim: Simulating humans in real time adds to the sense of presence.
Objective: To develop a system capable of animating humans in real time, with high-realism graphics and movement.
CyberTennis: virtual tennis players animated by real-time motion capture.

Real-Time Animation of Realistic Virtual Humans
Approach: a three-part process of (1) modelling the figure, (2) deforming the figure in real time, and (3) motion control.
Modelling: The body is divided into head, hands and body, as each part has different modelling requirements. The Sculptor software models the head and hands from prototype examples; the head is created from a template in Sculptor.

Real-Time Animation of Realistic Virtual Humans
The BodyBuilder software models the body, taking a multi-layered approach:
1) Articulated skeleton: body proportions are designed at this stage.
2) Joints: metaballs, ellipsoids or grouped volume primitives simulate muscle shape and behaviour; they are attached to the skeleton and can be transformed interactively.
3) Body envelope: the equivalent of human skin, built from spline surfaces.
4) Texture fitting or mapping: adds detail to the model (e.g. skin or hair); it requires correlation between model and image, for which new software was developed.
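The metaball layer can be sketched as a scalar field summed over muscle primitives, with the skin envelope taken as an isosurface of that field. The falloff function and threshold here are illustrative assumptions, not BodyBuilder's actual formulation:

```python
import math

def metaball_field(point, balls):
    """Sum Gaussian falloff contributions from each (cx, cy, cz, radius,
    strength) primitive; a skin point lies where this field crosses a
    chosen threshold (e.g. 0.5)."""
    total = 0.0
    px, py, pz = point
    for cx, cy, cz, radius, strength in balls:
        d2 = (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2
        total += strength * math.exp(-d2 / (radius * radius))
    return total
```

Because the contributions add, nearby primitives blend smoothly into one surface, which is why metaballs suit organic shapes such as muscles.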

Real-Time Animation of Realistic Virtual Humans
Animation: Animate the skeleton, and the top layers are automatically computed and move with it.
Skeleton: Each joint has defined degrees of freedom and rotations; animating the joint angles over time animates the skeleton.
Skin deformation: A compromise between realism and computing speed. Body data is output as cross-sectional contours, and triangle meshes are produced for each part. This allows manipulation of the skin contours and transforms 3D coordinates into a 2D plane.
Hands: Modelled in a similar way.
Facial animation: Based on a pseudo-muscle design that treats the skin surface as a polygonal mesh. Basic motion parameters are defined as minimum perceptible actions (MPAs); these define both the facial expressions and the animated shape of the face. A control lattice controls the face geometry. MPAs result from three types of input: video, audio or speech, and pre-defined actions.
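"Animating the joint angles over time animates the skeleton" can be illustrated with a planar forward-kinematics chain plus keyframe interpolation. This is a toy sketch; the function names and the 2D simplification are assumptions:

```python
import math

def fk_chain(joint_angles, bone_lengths):
    """Planar forward kinematics: accumulate each joint's rotation down
    the chain and return the 2D position of every joint."""
    x = y = theta = 0.0
    positions = [(x, y)]
    for angle, length in zip(joint_angles, bone_lengths):
        theta += angle                      # child inherits parent rotation
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        positions.append((x, y))
    return positions

def interpolate_pose(pose_a, pose_b, t):
    """Linearly blend two keyframe poses (lists of joint angles)."""
    return [(1 - t) * a + t * b for a, b in zip(pose_a, pose_b)]
```

Interpolating in angle space and then running FK is exactly what lets the top layers (skin, clothing) be "automatically computed" from the skeleton each frame.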

Real-Time Animation of Realistic Virtual Humans
Skeleton motion control: three possible approaches.
1) Skeleton motion captured in real time drives a pure avatar: movements exactly reflect those of the real person. Information is collected via tracking sensors and used in combination with knowledge of human animation.
2) Skeleton motion is selectively activated from a database of predefined motions: the avatar is controlled by the user in real time, but its movements do not correspond exactly to those of the user. Motion capture can be used to generate the animated sequences, which reduces design time.
3) Skeleton animation is dynamically calculated: an autonomous actor may act without the user's intervention. Its behaviour relies on perception of the environment; it should use visual, auditory and tactile senses and adapt its behaviour according to the information received.
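Approach 2 (a database of predefined motions) can be sketched as sampling a captured clip at an arbitrary time, blending the two bracketing frames and looping. The function name and the fixed frame rate are assumptions, not the system's actual interface:

```python
def sample_motion(keyframes, fps, t):
    """Sample a prerecorded motion clip at time t (seconds).
    keyframes is a list of poses (lists of joint angles) captured at a
    fixed rate; the clip loops when t runs past its end."""
    pos = (t * fps) % (len(keyframes) - 1)   # fractional frame index
    i = int(pos)
    frac = pos - i
    a, b = keyframes[i], keyframes[i + 1]
    return [(1 - frac) * x + frac * y for x, y in zip(a, b)]
```

A user-triggered gesture then reduces to picking a clip and advancing t each frame, rather than tracking the user's actual limbs.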

Discussion
How realistic does the graphical representation of the virtual embodiment have to be for a given use context? To what extent does the virtual embodiment have to be an accurate mapping of the real person, as opposed to a synthetic caricature, for a given use?