Facial animation retargeting framework using radial basis functions
Tamás Umenhoffer, Balázs Tóth

Introduction

Realistic facial animation is a challenging task. Facial gestures are among the most significant means of human communication, and even a small mistake in the animation can make it look unrealistic and unbelievable. Facial animation is crucial in the movie industry, but it also plays a large role in virtual reality applications and computer games. Real-time and offline systems, however, may require different approaches, so we investigate both real-time and offline possibilities. Our goal is to fit the retargeting method to the needs and capabilities of the different systems. We investigated methods that do not limit animators in choosing their rigging tools, and we also considered the capabilities of common real-time graphics engines. We propose two different techniques, one for real-time and one for offline environments. Both are based on Radial Basis Functions (RBFs), but they are used differently: once as a geometry mapping for real-time use, and once as an expression mapping scattered data interpolation method for production environments. We also show how the geometry mapping method can be built into the production pipeline so that the system can be trained without knowing the facial geometry of the actor who will later control the virtual character.

Method

Retargeting is the process of adjusting the target rig parameters according to the captured performance data. To do this, we have to find the relationship between these parameters. As this relationship is not a trivial function, we use an interpolation or approximation scheme based on sample source and target configurations, where the samples are placed not evenly but at intuitive configurations. We chose Radial Basis Function sets, as they have previously been used successfully for facial animation retargeting.

The source geometry is defined by the face of the user, and the sample points are tracked feature points. We used a real-time tracker library that tracks the 3D positions of these feature points in live video, typically from a web camera. The target geometry is the virtual model. However, the virtual model has a much higher resolution than the source geometry, so we also need to move the vertices that lie between the feature points. To achieve this efficiently, we set up a bone system for the virtual face model in which each source feature point has a corresponding bone. Bone weights should be painted carefully by the animators.

Figure 2 (a) shows the target model with the bone positions marked by green dots, and Figure 2 (b) shows the tracked feature points of the actual user. It is clearly visible that the virtual model and the user's face have different geometric features. If we used the tracked positions as they are, without retargeting, we would get the incorrect deformation shown in Figure 2 (d). With the geometry mapping method described above, the target model keeps its own features but also inherits the source face's motion (Figure 2 (c)).

Figure 2: Geometry mapping. Green dots show the positions of the bones that deform the face geometry (a). Red dots show the corresponding marker positions tracked for the actual user (b); the geometric features of the two models differ significantly. Using geometry mapping, the user's smile appears believably on the target mesh (c). Without geometry mapping, the target geometry would suffer from incorrect deformations (d).
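As a concrete illustration of the geometry mapping step, the short sketch below fits one RBF interpolant per coordinate that maps points from the actor's face space into the virtual face's space. The Gaussian kernel, its width and every function and variable name are our own illustrative assumptions, not the authors' exact implementation.

# A minimal sketch of RBF geometry mapping, assuming the tracked feature
# points of the actor's neutral pose and the corresponding bone rest
# positions on the virtual face are available as Nx3 arrays.
import numpy as np

def fit_rbf(src_samples, tgt_samples, sigma=1.0):
    """Fit one RBF interpolant per target coordinate.

    src_samples : (N, 3) feature points on the source (actor) face
    tgt_samples : (N, 3) corresponding bone positions on the target face
    """
    # Pairwise distances between the source samples, which act as RBF centers.
    d = np.linalg.norm(src_samples[:, None, :] - src_samples[None, :, :], axis=2)
    phi = np.exp(-(d / sigma) ** 2)              # Gaussian radial basis
    # Solve Phi * W = T for the weights of all three coordinates at once.
    # (A small ridge term can be added if Phi is ill-conditioned.)
    weights = np.linalg.solve(phi, tgt_samples)  # (N, 3)
    return weights, src_samples

def apply_rbf(points, weights, centers, sigma=1.0):
    """Map arbitrary source-space points into the target face's space."""
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    phi = np.exp(-(d / sigma) ** 2)
    return phi @ weights                         # (M, 3) retargeted positions

# Usage: train once on corresponding neutral-pose samples, then per frame map
# the live tracked feature points onto bone positions driving the virtual face.
# weights, centers = fit_rbf(neutral_tracked_points, bone_rest_positions)
# bone_positions = apply_rbf(tracked_frame_points, weights, centers)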
Results

Real-time expression retargeting in Maya: the animation of the face model was driven by a motion capture device based on real-time tracking of web-camera video.

Conclusions

We presented two facial animation retargeting methods, one for real-time use and one for higher quality animations. Both techniques use radial basis function sets to solve a scattered data interpolation problem; their main difference is the complexity of the mapping. Our real-time method uses geometry mapping to instantly map a 3D point from the source geometry's surface onto the target geometry's surface, whereas the expression mapping method maps complex configurations of feature points onto complex configurations of rig parameters. We have shown that RBF regression is a very flexible solution for facial animation: depending on the needs, one can control a bone system, blend shape weights, or any other parameter set, and the system can be trained either for generic use or for a particular shot or performer.
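For contrast with the per-point geometry mapping above, the sketch below illustrates the expression mapping variant: here the RBF does not map individual 3D points but whole feature-point configurations onto rig parameter vectors (for example blend shape weights). The training pairs, kernel choice and all names are assumptions made for illustration, not the authors' exact production setup.

# A minimal sketch of expression mapping, assuming a set of training pairs in
# which each captured feature-point configuration was matched by an animator
# with a rig parameter vector.
import numpy as np

def train_expression_mapping(sample_configs, sample_rig_params, sigma=1.0):
    """sample_configs    : (K, D) flattened feature-point configurations
       sample_rig_params : (K, P) rig parameter vectors set by the animator"""
    d = np.linalg.norm(sample_configs[:, None, :] - sample_configs[None, :, :], axis=2)
    phi = np.exp(-(d / sigma) ** 2)                    # Gaussian radial basis
    weights = np.linalg.solve(phi, sample_rig_params)  # (K, P)
    return weights, sample_configs

def evaluate_expression_mapping(config, weights, centers, sigma=1.0):
    """Map one captured configuration (D,) to a rig parameter vector (P,)."""
    d = np.linalg.norm(centers - config, axis=1)
    phi = np.exp(-(d / sigma) ** 2)
    return phi @ weights

Because the interpolation runs in the space of whole expressions rather than individual points, the same training code applies whether the rig parameters are bone transforms, blend shape weights or any other animation controls.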