Gesture Recognition 12/3/2009.

Demo
- Some hand gestures
- Opera Face Gesture
- Eye tracking
- 3D UI
- Hitachi gesture-operated TV
- Hand gesture recognition
- Stereo finger tracking
- OpenCV: http://sourceforge.net/projects/opencvlibrary/

The Nature of Gesture
Gestures are expressive, meaningful body motions, i.e., physical movements of the fingers, hands, arms, head, face, or body made with the intent to convey information or to interact with the environment.

Functional Roles of Gesture
- Semiotic: to communicate meaningful information.
- Ergotic: to manipulate the environment.
- Epistemic: to discover the environment through tactile experience.

Gesture vs. Posture
Posture refers to a static position, configuration, or pose; gesture involves movement. Dynamic gesture recognition must therefore model temporal events, typically with techniques such as time-compressing templates, dynamic time warping, hidden Markov models (HMMs), and Bayesian networks.
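Dynamic time warping, one of the temporal techniques named above, aligns two gesture trajectories that trace the same motion at different speeds. Below is a minimal sketch, assuming 1-D trajectories (e.g., the x-coordinate of a tracked hand per frame); the function name and sample data are illustrative, not from the slides:

```python
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance
    between two 1-D sequences, using the unit step pattern."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = DTW distance between prefixes a[:i] and b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local match cost
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

# A time-stretched copy of the same motion aligns at zero cost;
# a different motion does not.
template  = [0, 1, 2, 3, 2, 1, 0]
same_slow = [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0]
different = [3, 2, 1, 0, 1, 2, 3]
```

Because the warp absorbs speed variation, the same template matches a gesture performed quickly or slowly; HMMs extend this idea by making the alignment probabilistic.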

Five Types of Gestures
According to Kendon [1972]:
- Gesticulation: spontaneous movements of the hands and arms that accompany speech.
- Language-like gestures: gesticulation that is integrated into a spoken utterance, replacing a particular spoken word or phrase.
- Pantomimes: gestures that depict objects or actions, with or without accompanying speech.
- Emblems: familiar gestures such as "V for victory" or "thumbs up", and assorted rude gestures (these are often culturally specific).
- Sign languages: well-defined linguistic systems, such as American Sign Language.

Examples
- Pen-based gesture recognition
- Tracker-based gesture recognition
  - Instrumented gloves
  - Body suits
- Passive vision-based gesture recognition
  - Head and face gestures
  - Hand and arm gestures
  - Body gestures

Vision-based Gesture Recognition
Advantages:
- Passive and non-obtrusive
- Low cost
Challenges:
- Efficiency: can we process 30 image frames per second?
- Accuracy: can we remain robust as the environment changes?
- Occlusion: a single camera sees only one point of view, while multiple cameras introduce integration and correspondence issues.
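The efficiency challenge can be made concrete: at 30 frames per second, the entire pipeline (capture, feature extraction, classification) must finish each frame within roughly 33 ms. A tiny sketch of that budget check; the helper name is illustrative:

```python
# Real-time budget for a vision pipeline: at `fps` frames per second,
# per-frame processing must finish within 1/fps seconds.
FPS = 30
FRAME_BUDGET_S = 1.0 / FPS  # ~0.033 s (33 ms) per frame at 30 fps

def meets_realtime(per_frame_seconds, fps=FPS):
    """True if a pipeline's measured per-frame time fits the fps budget."""
    return per_frame_seconds <= 1.0 / fps

# A pipeline spending 25 ms per frame keeps up at 30 fps,
# but the same pipeline would drop frames at 60 fps (budget ~16.7 ms).
```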

Gesture Recognition System

Issues
- Number of cameras: how many cameras are used? If more than one, are they combined early (stereo) or late (multi-view)?
- Speed and latency: is the system real-time, i.e., fast enough and with low enough latency for interaction?
- Structured environment: are there restrictions on the background, the lighting, the speed of movement, etc.?
- User requirements: must the user wear anything special (e.g., markers, gloves, long sleeves)? Is anything disallowed (e.g., glasses, a beard, rings)?
- Primary features: what low-level features are computed (edges, regions, silhouettes, moments, histograms, etc.)?
- Two- or three-dimensional representation.
- Representation of time: how is the temporal aspect of gesture represented and used in recognition?

Tools for Gesture Recognition
Static gesture (pose) recognition:
- Template matching
- Neural networks
- Pattern recognition techniques
Dynamic gesture recognition:
- Time-compressing templates
- Dynamic time warping
- Hidden Markov models
- Conditional random fields
- Time-delay neural networks
- Particle filtering and the condensation algorithm
- Finite state machines
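Of the dynamic-recognition tools listed, the finite state machine is the simplest to sketch. Below, a hypothetical "swipe right" gesture is recognized from a stream of quantized motion tokens; the token alphabet, state names, and function names are assumptions for illustration, not from the slides:

```python
def make_swipe_right_fsm():
    """Transition table for a 'swipe right' gesture:
    IDLE -> MOVING on rightward motion, MOVING -> DONE on stop.
    Any unexpected token falls back to IDLE (handled in recognize)."""
    return {
        ("IDLE", "right"): "MOVING",
        ("MOVING", "right"): "MOVING",
        ("MOVING", "stop"): "DONE",
    }

def recognize(tokens, transitions):
    """Run the token stream through the FSM; True if DONE is reached."""
    state = "IDLE"
    for t in tokens:
        # Unlisted (state, token) pairs reset the machine to IDLE.
        state = transitions.get((state, t), "IDLE")
        if state == "DONE":
            return True
    return False
```

In a real system, each frame's tracked hand velocity would be quantized into such tokens; an HMM generalizes this FSM by making the transitions probabilistic, which tolerates noisy observations.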

Head and Face Gestures
- Nodding or shaking the head
- Direction of eye gaze
- Raising the eyebrows
- Opening the mouth to speak
- Winking
- Flaring the nostrils
- Facial expressions: looks of surprise, happiness, disgust, anger, sadness, etc.

Hand Gesture
Vision-based hand gesture recognition systems:
- Research at the VIP Lab
- NTU thesis: A Hand Gesture Recognition System for Replacing a Mouse

Body Gesture
Human dynamics: tracking full-body motion, recognizing body gestures, and recognizing human activity. An activity may span a much longer period than what is normally considered a gesture; for example, two people meeting in an open area, stopping to talk, and then continuing on their way may be considered a recognizable activity.
Bobick (1997) proposed a taxonomy of motion understanding:
- Movement: the atomic elements of motion.
- Activity: a sequence of movements or static configurations.
- Action: a high-level description of what is happening in context.

Examples
- Monitoring human activity
- Computer Vision Surveillance Industry-Academia Alliance (電腦視覺監控產學聯盟)
- Human Activity Recognition: A Grand Challenge

Suggestions for System Design (I)
- Do inform the user.
- Do give the user feedback.
- Do take advantage of the uniqueness of gesture.
- Do understand the benefits and limits of the technology.
- Do usability testing on the system.
- Do avoid temporal segmentation if feasible.

Suggestions for System Design (II)
- Don't tire the user.
- Don't make the gestures to be recognized too similar.
- Don't use gesture as a gimmick.
- Don't increase the user's cognitive load.
- Don't require precise motion.
- Don't create a new, unnatural gestural language.