Facial Tracking and Animation Project Proposal Computer System Design Spring 2004 Todd Belote, David Brown, Brad Busse, Bryan Harris

Problem Background

Speech Driven Facial Animation (PRISM)
– Facial animation from processed speech

Previous Research (Jablonski & Zavala)
– Low-cost facial motion and speech processing using facial markers
– An infrared camera is used with infrared reflectors to pick up facial markers
– Microphone array for audio
– Winnov capture card (640x480 at 30 fps)

Areas of Desired Improvement
– There is no current recovery method for point loss
– Feature points skewed in perspective create playback artifacts
– There are only 22 feature points, which cannot fully describe a face
– Initialization requires mouse-clicking the markers on the first frame
– The current algorithm is costly
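The point-loss problem comes from the frame-to-frame matching step: each tracked marker must be paired with a detection in the next frame, and a marker with no nearby detection is simply dropped. A minimal sketch of such greedy nearest-neighbor matching (the 15-pixel search radius is an illustrative assumption, not a parameter of the original system):

```python
import math

def match_markers(prev_pts, curr_pts, max_dist=15.0):
    """Greedily match each previous marker to the nearest detection in
    the current frame; markers with no detection within max_dist pixels
    are reported as lost (the old system had no recovery for these)."""
    matches = {}
    lost = []
    unused = list(curr_pts)
    for i, (px, py) in enumerate(prev_pts):
        best, best_d = None, max_dist
        for c in unused:
            d = math.hypot(c[0] - px, c[1] - py)
            if d < best_d:
                best, best_d = c, d
        if best is None:
            lost.append(i)          # point loss: marker occluded or skewed away
        else:
            matches[i] = best
            unused.remove(best)     # each detection pairs with one marker
    return matches, lost
```

With 22 markers this brute-force pairing is cheap; its weakness, as noted above, is that a single occluded frame permanently loses the point.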

Problem Statement Design a low-cost computer system that can decode and analyze audio/video data in real time and save the resulting analysis to disk.

Design Objectives Analyze video at 30 fps and generate an FAP (Facial Animation Parameter) file. Continue with the current audio analysis. All processing is done in real time.
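MPEG-4 defines 68 FAPs per frame, so the output can be streamed one frame at a time to keep up with 30 fps capture. A minimal sketch of such a writer; the header and per-frame line layout here are an illustrative assumption, not the verbatim MPEG-4 .fap syntax:

```python
def write_fap(path, fap_values_per_frame, fps=30, name="face"):
    """Streaming FAP output sketch (assumed layout): a version/name/
    fps/frame-count header, then per frame a 0/1 mask line flagging
    which of the 68 FAPs are present, and a line of their values."""
    n_faps = 68  # MPEG-4 defines 68 facial animation parameters
    with open(path, "w") as f:
        f.write("2.1 %s %d %d\n" % (name, fps, len(fap_values_per_frame)))
        for frame_no, values in enumerate(fap_values_per_frame):
            # values: dict mapping FAP number (1..68) -> value
            mask = [1 if i in values else 0 for i in range(1, n_faps + 1)]
            f.write(" ".join(map(str, mask)) + "\n")
            row = [values[i] for i in range(1, n_faps + 1) if i in values]
            f.write("%d %s\n" % (frame_no, " ".join(map(str, row))))
```

Writing per-frame rather than buffering the whole session matches the real-time objective: no work is deferred to the end of capture.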

Existing Solutions Lin et al., from National Taiwan University, use a system of mirrors and a camera to determine the (x,y,z) coordinates of feature points. Essa et al., from the MIT Media Lab, use computer recognition to analyze video in a non-real-time setting.

Microphone Array, Camera, and Capture Card Audio sample rates range from 8 to 48 kHz. The video sample rate will be 30 fps, captured at 640x480 resolution. The capture card brings audio and video into the computer in synchrony.
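Because the capture card keeps the two streams in synchrony from a common start time, mapping a video frame to its audio samples is simple integer arithmetic. A sketch (44.1 kHz is just one example rate from the 8–48 kHz range):

```python
def audio_span_for_frame(frame_idx, fps=30, sample_rate=44100):
    """Return the [start, end) audio-sample indices that line up with a
    given video frame, assuming both streams start at t = 0."""
    start = frame_idx * sample_rate // fps
    end = (frame_idx + 1) * sample_rate // fps
    return start, end
```

At 44.1 kHz and 30 fps each frame spans 1470 samples, so frame 0 maps to samples [0, 1470) and frame 1 to [1470, 2940).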

IBM Pupil Cam A camera designed to identify human pupils by emitting infrared light from LEDs. We will use infrared-reflective markers to track facial movements.
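Under the camera's infrared illumination, the reflective markers show up as near-saturated blobs, so detection can be as simple as thresholding the grayscale image and taking the centroid of each bright connected component. A pure-Python sketch of this idea (the threshold of 200 is an illustrative assumption):

```python
def find_markers(gray, threshold=200):
    """Return one (x, y) centroid per bright connected component in a
    grayscale image given as a 2D list of 0-255 intensity values."""
    h, w = len(gray), len(gray[0])
    seen = [[False] * w for _ in range(h)]
    markers = []
    for y in range(h):
        for x in range(w):
            if gray[y][x] >= threshold and not seen[y][x]:
                # flood-fill one bright blob (4-connected)
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and gray[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                x_mean = sum(p[1] for p in pixels) / len(pixels)
                y_mean = sum(p[0] for p in pixels) / len(pixels)
                markers.append((x_mean, y_mean))
    return markers
```

A production version would run on the raw 640x480 frames and could add a minimum blob size to reject specular noise; this sketch only shows the core idea.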

Design Constraints and Feasibility

Cost and Speed
– The system must run in real time

Portability
– The current system is hardware-specific
– QuickTime would make a more portable system

Usability
– Point initialization
– Freedom of movement for each user
– Recovery from point occlusion

Alternative Solutions
– Leave the system as-is
– Use mirrors to find (x,y,z) coordinates
– QuickTime libraries
– Individual facial templates

Design Validation
– Can generate an FAP file in real time
– Lost points can be recovered
– The user can rotate her head without data loss
– Audio is analyzed in real time
– Automatic point initialization
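One way to meet the lost-point criterion is to extrapolate an occluded marker from its recent trajectory until it is re-detected. A minimal constant-velocity sketch; the actual recovery model is a design choice not specified in this proposal:

```python
def predict_lost_point(history):
    """Extrapolate an occluded marker's next position from its last two
    known positions, assuming constant velocity between frames."""
    (x1, y1), (x2, y2) = history[-2], history[-1]
    return (2 * x2 - x1, 2 * y2 - y1)
```

For example, a marker last seen at (0, 0) and then (3, 4) is predicted at (6, 8) on the occluded frame; once a detection reappears near the prediction, normal tracking resumes.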

Societal, Safety and Environmental Analysis
– Primary use as a research tool
– Low-bandwidth face-to-face phones using texture maps
– Low-bandwidth phone support for the deaf
– Audio-visual data recording

Management
Todd Belote – Data Acquisition
David Brown – Marker Initialization
Brad Busse – Marker Tracking Algorithms
Bryan Harris – Facial Relationships

Scheduling

Mondays – 4:20 to 6:30 PM, with Steve Ortiz
Wednesdays – 5:30 to 7:00 PM, with Steve Ortiz and Marco Zavala

Steve Ortiz – Project Advisor
Marco Zavala – Previous Project Owner

Scheduling – Gantt Chart

PERT Chart