 INTRODUCTION  STEPS OF GESTURE RECOGNITION  TRACKING TECHNOLOGIES  SPEECH WITH GESTURE  APPLICATIONS

Gestures are expressive, meaningful body motions – i.e., physical movements of the fingers, hands, arms, head, face, or body with the intent to convey information or interact with the environment.

Mood and emotion are expressed through body language, facial expressions, and tone of voice. Gesture recognition allows computers to interact with human beings in a more natural way, and allows control without having to touch the device.

Gestures can replace the mouse and keyboard: pointing gestures let the user navigate a virtual environment, pick up and manipulate virtual objects, and interact with a 3D world, all with no physical contact with the computer, so the user can communicate at a distance.

1. DATAGLOVES / CYBERGLOVES: gloves equipped with sensors, typically using fibre-optic cables to measure finger flexion.

A sign-language vocabulary can contain around 5,000 gestures; each gesture consists of a hand shape, a hand motion, and a location in 3D space.

THE PROCESS: 1. Colour segmentation. 2. Noise removal. 3. Scale by area.
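The colour-segment / noise-removal / scale-by-area pipeline can be sketched as follows. The skin-colour thresholds and the neighbour count are illustrative assumptions, not values from the slides:

```python
import numpy as np

def segment_skin(rgb, r_min=95, g_min=40, b_min=20):
    """Very rough skin-colour segmentation: threshold each channel.
    The thresholds are illustrative, not tuned values."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > r_min) & (g > g_min) & (b > b_min) & (r > g) & (r > b)

def remove_noise(mask, min_neighbours=2):
    """Drop isolated pixels: keep a pixel only if enough of its
    4-neighbours are also set (a crude morphological clean-up)."""
    padded = np.pad(mask, 1)
    neighbours = (padded[:-2, 1:-1].astype(int) + padded[2:, 1:-1]
                  + padded[1:-1, :-2] + padded[1:-1, 2:])
    return mask & (neighbours >= min_neighbours)

def scale_by_area(mask, target_area=64):
    """Normalise the segmented region by its pixel area, so hands at
    different distances from the camera give comparable features."""
    area = mask.sum()
    return np.sqrt(target_area / area) if area else 0.0
```

The returned scale factor would then be applied to the segmented region before feature extraction.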

2. COMPUTER-VISION TECHNOLOGY: use of cameras, whether depth cameras, stereo cameras, or normal cameras.

Here the index finger is recognized and, when extended, becomes a drawing tool. Here, text is entered by pointing at the character desired. Here the index fingers and thumbs of the two hands are recognized and are used to control the shape of the object being defined.

We need to search thousands of images. How can we do this efficiently? We use a “coarse-to-fine” search strategy.

The original image is blurred with increasing blurring factors (1.0, 2.0, 3.0), forming the levels of the coarse-to-fine pyramid.
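The coarse-to-fine idea can be sketched as a two-level pyramid search: match at the downsampled level first, then refine only a small window at full resolution. The details here (sum-of-squared-differences matching, 2x2 averaging, the refinement radius) are illustrative choices:

```python
import numpy as np

def downsample(img):
    """Halve the resolution by averaging 2x2 blocks (one pyramid level)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def best_match(img, tmpl):
    """Exhaustive sum-of-squared-differences search at one scale."""
    th, tw = tmpl.shape
    best, pos = np.inf, (0, 0)
    for y in range(img.shape[0] - th + 1):
        for x in range(img.shape[1] - tw + 1):
            d = np.sum((img[y:y+th, x:x+tw] - tmpl) ** 2)
            if d < best:
                best, pos = d, (y, x)
    return pos

def coarse_to_fine(img, tmpl, radius=2):
    """Search the downsampled level first, then refine the winning
    position in a small window at full resolution."""
    cy, cx = best_match(downsample(img), downsample(tmpl))
    cy, cx = 2 * cy, 2 * cx                     # map back to fine level
    th, tw = tmpl.shape
    best, pos = np.inf, (cy, cx)
    for y in range(max(0, cy - radius), min(img.shape[0] - th, cy + radius) + 1):
        for x in range(max(0, cx - radius), min(img.shape[1] - tw, cx + radius) + 1):
            d = np.sum((img[y:y+th, x:x+tw] - tmpl) ** 2)
            if d < best:
                best, pos = d, (y, x)
    return pos
```

The coarse pass examines a quarter as many positions; only a handful of fine-level candidates around the coarse winner need the full-resolution comparison.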

Hidden Markov Models (HMMs) are used to model the time sequence of images. One HMM is trained per sign, e.g. HMM1 (Hello), HMM2 (Good), HMM3 (Bad), HMM4 (House); a feature sequence f is assigned to the model with the highest likelihood P(f | HMM).
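A minimal sketch of scoring a feature sequence against several per-sign HMMs with the forward algorithm; the toy transition and emission matrices used below are made up for illustration:

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-output
    HMM with initial distribution pi, transition matrix A and emission
    matrix B (states x symbols)."""
    alpha = pi * B[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()                 # rescale to avoid underflow
        log_p += np.log(s)
        alpha = alpha / s
    return log_p

def classify(obs, models):
    """Assign the sequence to the sign whose HMM scores it highest,
    i.e. argmax over P(f | HMM_i)."""
    return max(models, key=lambda name: log_likelihood(obs, *models[name]))
```

In practice each model would be trained on example sequences of its sign (e.g. with Baum-Welch), and the observations would be quantised hand-shape/position features rather than raw symbols.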

Given previous frames we can predict what will happen next, which speeds up the search and helps to handle occlusions.
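The simplest form of this prediction is a constant-velocity extrapolation of the tracked hand position; a real tracker would use something like a Kalman filter. A sketch, with a hypothetical search-window helper:

```python
def predict_next(positions):
    """Constant-velocity prediction: extrapolate the last step from the
    two most recent (x, y) positions."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

def search_window(pred, radius):
    """Restrict the next frame's search to a window around the
    prediction, returned as (x_min, x_max, y_min, y_max)."""
    px, py = pred
    return (px - radius, px + radius, py - radius, py + radius)
```

Only the window around the prediction is searched in the next frame, and a brief occlusion can be bridged by continuing to extrapolate until the hand reappears.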

In fluent dialogue, signs are modified by the preceding and following signs, producing intermediate forms between them (coarticulation).

Single pose:  Standard head-and-shoulders view with uniform background.  Easy to find the face within the image.

Alignment: faces in the training set must be aligned with each other to remove the effects of translation, scale, rotation, etc. It is easy to find the positions of the eyes and mouth and then shift and resize the images so that they are aligned with each other.
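A minimal alignment sketch, assuming the eye positions have already been found: scale so the eyes are a canonical distance apart, then translate so the left eye lands at a canonical position (rotation is omitted to keep the sketch short):

```python
import numpy as np

def align(points, left_eye, right_eye,
          target_left=(30.0, 30.0), target_dist=40.0):
    """Shift-and-resize alignment of landmark points: scale by the eye
    distance, then translate the left eye to its canonical position.
    The canonical values are illustrative assumptions."""
    left_eye = np.asarray(left_eye, float)
    right_eye = np.asarray(right_eye, float)
    scale = target_dist / np.linalg.norm(right_eye - left_eye)
    pts = np.asarray(points, float) * scale
    offset = np.asarray(target_left) - left_eye * scale
    return pts + offset
```

Applying the same scale and offset to the pixel grid resamples the whole image into the canonical frame.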

Once the images have been aligned, you can simply search for the member of the training set which is nearest to the test image. There are a number of distance measures, including Euclidean distance and cross-correlation.
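Both measures can be sketched directly; the gallery below is a hypothetical dict mapping person names to flattened, aligned images:

```python
import numpy as np

def nearest_face(test, gallery):
    """Return the label of the gallery image closest to the test image
    in Euclidean distance (images as flattened vectors)."""
    dists = {name: np.linalg.norm(test - img) for name, img in gallery.items()}
    return min(dists, key=dists.get)

def correlation_face(test, gallery):
    """Same idea using normalised cross-correlation (higher is better),
    which is insensitive to brightness and contrast changes."""
    def ncc(a, b):
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.mean(a * b))
    scores = {name: ncc(test, img) for name, img in gallery.items()}
    return max(scores, key=scores.get)
```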

PCA reduces the number of dimensions, so the memory requirement is much reduced. The search time is also reduced.
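A sketch of this eigenface-style reduction via SVD: each face becomes a k-vector of PCA coefficients instead of a full pixel vector, and can be approximately reconstructed from those coefficients:

```python
import numpy as np

def pca_project(faces, k):
    """PCA via SVD on a faces matrix (one flattened face per row).
    Returns the mean face, the top-k components, and each face's
    k coefficients."""
    mean = faces.mean(axis=0)
    centred = faces - mean
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    comps = vt[:k]                      # top-k principal directions
    return mean, comps, centred @ comps.T

def reconstruct(mean, comps, coeffs):
    """Map k coefficients back to pixel space."""
    return mean + coeffs @ comps
```

Nearest-neighbour search then runs on the k coefficients rather than the full images, which is where the memory and time savings come from.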

The same person may sometimes appear different due to beards, moustaches, glasses, or makeup. These variations have to be represented by different ellipsoids.

There are six basic types of facial expression: anger, fear, disgust, happiness, sadness, and surprise. We could use PCA on the eyes and mouth, giving us eigeneyes and eigenmouths.

Heads must now be aligned in 3D world space. Classes now form trajectories in feature space. It becomes difficult to recognise faces because the variation due to pose is greater than the variation between people.

We can fit a model directly to the face image. The model consists of a mesh which is matched to facial features such as the eyes, nose, mouth, and edges of the face. We use PCA to describe the parameters of the model rather than the pixels.

Voice and gesture complement each other and form a more powerful interface than either modality alone. Speech and gesture together make a more interactive interface, and combining gesture and voice increases recognition accuracy.
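One simple way to realise this combination is late fusion: each recogniser outputs per-command scores and a weighted average decides. The weight and the score dictionaries below are illustrative:

```python
def fuse(speech, gesture, w=0.5):
    """Late fusion of two recognisers: weighted average of per-command
    probabilities; the best fused command wins. w is the weight given
    to the speech modality."""
    fused = {c: w * speech.get(c, 0.0) + (1 - w) * gesture.get(c, 0.0)
             for c in set(speech) | set(gesture)}
    return max(fused, key=fused.get)
```

A confident gesture can thus overrule an ambiguous utterance and vice versa, which is how fusion raises overall recognition accuracy.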

Media room: within the media room the user can use gesture, speech, eye movements, or a combination of all three. Example: one application allowed the user to manage colour-coded ships against a map of the Caribbean. The user just needs to point at a location and say “create a large blue tank”, and a blue tank will appear at that location.

 Sign language recognition: gesture recognition software can transcribe the symbols represented through sign language into text.  Control through facial gestures: controlling a computer through facial gestures is a useful application of gesture recognition for users who may not physically be able to use a mouse or keyboard.  Immersive game technology: gestures can be used to control interactions within video games, making the game player's experience more interactive and immersive.

Images: a person playing a game while the computer responds to their instructions; a girl instructing the computer with her body movements.

 Virtual controllers: for systems where finding or acquiring a physical controller would take too much time, gestures can serve as an alternative control mechanism.  Affective computing: in affective computing, gesture recognition is used in the process of identifying emotional expression through computer systems.  Remote control: gesture recognition makes “remote control with the wave of a hand” of various devices possible. The signal must indicate not only the desired response, but also which device is to be controlled.

Occlusions (Atid). Grammars in Irish Sign Language: sentence recognition. Body language.

 Y. Wu and T. S. Huang, “Vision-Based Gesture Recognition: A Review,” Lecture Notes in Artificial Intelligence.  Wikipedia.