Speech and Gesture Corpus: From Designing to Piloting
Gheida Shahrour
Supervised by Prof. Martin Russell and Dr Neil Cooke
Electronic, Electrical and Computer Engineering, University of Birmingham

Motivation
Our research focuses on modelling human behaviour from body motion. No existing dataset serves this research focus.

Dataset Specification
We need data that:
- Contains the motion of people's heads, arms and hands
- Is captured from people from different cultural backgrounds
- Contains spontaneous speech
- Is captured using a marker-based tracking technique

Why a Marker-Based Tracking Technique?
Capturing people's gestures is mainly done with computer-vision techniques, which have known weaknesses:
- Skin colour: confounded by variation in people's skin and in image lighting
- Contours of people: objects may overlap or be occluded
- Tracking across a sequence of frames: may not be accurate
- Images are 2D: accuracy issues
To avoid these problems, we will capture gestures using marker-based optical motion tracking:
- Data are obtained in a 3D coordinate system
- Less occlusion, and occluded markers are recovered easily
- Objects are tracked accurately, given good calibration
- Tracking light-reflective markers gives high accuracy

Qualisys Track Manager (QTM)
The Balance and Posture Laboratory in the School of Psychology is equipped with a QTM system:
1. Cameras with LED strobes that emit a beam of infrared light, invisible to the naked eye
2. QTM software & an analogue interface for recording speech
3. Passive markers of different sizes
4. Calibration kit: L-shaped axis frame & T-shaped wand

Camera & Strobe

How It Works
1. The spherical markers are coated with a material that amplifies their brightness.
2. The strobes project infrared light towards the markers, and the markers reflect it back to the camera.
3. Each camera measures the 2D position of each reflective marker in its image.
4. The system combines the 2D data from multiple cameras to calculate the 3D position of the markers with high spatial resolution.
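Step 4 is standard linear triangulation. Below is a minimal sketch, assuming calibrated cameras with known 3x4 projection matrices; the function and data layout are illustrative placeholders, not the QTM implementation:

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Estimate one 3D point from its 2D images in N >= 2 cameras.

    proj_mats : list of 3x4 camera projection matrices (assumed known
                from calibration)
    points_2d : list of (u, v) pixel coordinates, one per camera
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on X = (x, y, z, 1).
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # Homogeneous least-squares solution: the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With more than two cameras the extra rows over-determine the system, which is why occluded markers can often still be reconstructed from the remaining views.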

How It Works

The Process of Capturing Data
1. Attach markers to the objects of interest (how?)
2. Define the measurement area where subjects will stand
3. Test the area
4. Calibrate the area
5. Capture your data
6. Save your data

Reprocess Data Files
Reprocess the files you captured to construct the 3D view (how?)

Labeling Data
Label your data (how?):
1. Create a text file (unique name)
2. Assign a unique colour
3. Upload the file
4. Drag & drop
5. Play the motion data
6. Play it again
7. Fill the gaps
8. Play it again
9. Save the file
10. Export the data

Experiment (1): Methods & Materials
- 2 volunteers, each wearing 36 flat-based half-spherical 7 mm markers: head (4), elbows (2), waist (4), golf gloves (26)
- 12 cameras; measurement volume not specified
- Frame rate: 200 frames per second
- Speech not recorded

Experiment (1): 3D View

Experiment (1): Best Result

Experiment (2): Motivation
To improve the quality of the data:
1. Quantity: the number of unidentified marker trajectories should equal the number of markers used in the experiment.
2. Quality: no loss of markers and no ghost markers.
The technique: reduce both the number of markers and the measurement volume.
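Both criteria can be checked automatically once a capture is exported. A minimal sketch, assuming the trajectories are loaded into a dict of NumPy arrays with NaN rows wherever a marker was lost; this layout is an assumption for illustration, not the QTM export format:

```python
import numpy as np

def check_capture(trajectories, n_markers_used):
    """trajectories: dict of name -> (n_frames, 3) array, NaN where lost."""
    # Quantity: one trajectory per physical marker. More trajectories
    # suggests ghost markers or broken trajectories; fewer suggests
    # markers that were never reconstructed.
    if len(trajectories) != n_markers_used:
        print(f"quantity check failed: {len(trajectories)} trajectories "
              f"for {n_markers_used} markers")

    # Quality: report how much of each trajectory is missing (gaps).
    for name, xyz in trajectories.items():
        missing = np.isnan(xyz[:, 0]).mean()
        if missing > 0:
            print(f"{name}: {missing:.1%} of frames missing")
```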

Typical 3D Data & Camera Positions

Low vs. High 3D Tracker Parameters
- Prediction error
- Residual (for the remainder of the trajectory): set to low
- Filling gaps between frames
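One way to picture the low vs. high settings is as two parameter sets for the 3D tracker. The names echo the slide, but the keys and values below are made up for illustration and are not recommended QTM settings:

```python
# Illustrative only: hypothetical parameter names and values.
tracker_params = {
    "low": {
        "prediction_error_mm": 10,  # allowed deviation from the predicted path
        "max_residual_mm": 2,       # allowed 2D-ray mismatch for a 3D point
        "max_gap_frames": 10,       # longest gap filled automatically
    },
    "high": {
        "prediction_error_mm": 40,
        "max_residual_mm": 10,
        "max_gap_frames": 100,
    },
}
```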

Marker Trajectories & Filling the Gaps

Missing Data

How to Fill these Gaps?
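A common answer is to interpolate across the missing frames using the surrounding samples of the same trajectory. A minimal sketch using cubic splines; QTM provides its own gap-fill tools, so this only illustrates the idea, and long gaps should not be filled blindly:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def fill_gaps(xyz):
    """xyz: (n_frames, 3) array with NaN rows where the marker was lost."""
    filled = xyz.copy()
    frames = np.arange(len(xyz))
    valid = ~np.isnan(xyz[:, 0])
    for dim in range(3):
        # Fit a spline to the observed samples of this coordinate and
        # evaluate it at the missing frames. Note that gaps at the very
        # start or end become extrapolation, which is unreliable.
        spline = CubicSpline(frames[valid], xyz[valid, dim])
        filled[~valid, dim] = spline(frames[~valid])
    return filled
```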

Experiment (2): Methods & Materials
- 3 volunteers, each wearing 28 flat-based half-spherical 7 mm markers: head (4), elbows (2), shoulders (2), waist (4), golf gloves (16)

Experiment (2): Measurement Volume

Experiment (1): Camera Positions

Experiment (2): Camera Positions

Experiment (2): Sessions

Experiment (2): Result

Conclusion
- We will track the motion of the head, arms and hands, leaving three fingers out: middle, ring and pinky.
- Occlusion of the markers on the fingers is due not only to the camera setup, but also to the degrees of freedom of the hands.
- Finding unidentified marker trajectories is laborious and time-consuming.
- Tracking all fingers is very useful for many applications, such as sign language recognition, but this is not our focus.

Data Collection: Assignment
Each volunteer will wear passive markers of not less than 12 mm on: head (4), elbows (2), waist (4), shoulders (3) and gloves (10)

Group Setup
- Put yourselves into groups of 3. The members of each group should share the same first language, gender and country of birth.
- Each member of the British group (country of birth is Britain & first language is English) will record 2 sessions. Each session lasts 15 minutes, captured in 5 stages of 3 minutes each.
- Each member of a cultural group (country of birth is not Britain & first language is not English) will record 4 sessions: 2 in English as a second language and 2 in their first language. Each session lasts 15 minutes, captured in 5 stages of 3 minutes each.
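As a quick sanity check, the schedule arithmetic works out as follows (figures taken from the slide):

```python
stage_min, stages_per_session = 3, 5
session_min = stage_min * stages_per_session        # 15 minutes per session
british_member_min = 2 * session_min                # 2 sessions -> 30 minutes
cultural_member_min = (2 + 2) * session_min         # 2 ESL + 2 first-language -> 60 minutes
print(session_min, british_member_min, cultural_member_min)  # 15 30 60
```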

Any Questions?