
Under the Guidance of Mr. A. S. Jalal, Associate Professor, Dept. of Computer Engineering and Applications, GLA University, Mathura. Presented by Dev Drume Agrawal, M.Tech. (CSE), GLA University, Mathura.

Content:  Abstract  Introduction  Motivation  Literature Review  Issues & Challenges  Objective and Proposed Framework  Summary  References

INTRODUCTION  FERS (Facial Emotion Recognition System) is an automated system that works on the basis of training data.  It is a very useful tool for human-machine interaction.  The FERS works on: a single image, or a sequence of images (a segment of video).  It works on 2D images.

Introduction (Cont…) The FERS recognizes the six basic emotions [1] identified as ‘universal facial expressions’ by Ekman et al. [2]: happiness, anger, sadness, surprise, disgust and fear. Fig 1.1: Six universal emotions (anger, happiness, sadness, surprise, fear and disgust) [f1]

MOTIVATION The FERS is a fast-growing tool offered with varying functionalities [3], such as  Real-time automated systems,  Gesture recognition,  Recognition from video,  Recognition of facial emotion supported by audio input, etc.

Motivation (Cont…) FERS can be used as a tool in various integrated systems for human-machine interaction, such as  Robots,  Investigations by intelligence agencies, etc.

LITERATURE SURVEY Ekman & Friesen (1978) [4] developed the Facial Action Coding System (FACS), a technique for measuring facial movements in behavioral science. Ekman & Rosenberg (1997) [5] FACS uses 46 defined action units, each corresponding to an independent motion of the face.
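The idea of recognizing emotions from action units (AUs) can be sketched as a simple lookup. The AU combinations below follow commonly cited EMFACS-style prototypes, but the exact sets vary across sources, so treat them as illustrative rather than authoritative:

```python
# Approximate AU prototypes for a few basic emotions; the AU sets
# here are illustrative, not a definitive FACS/EMFACS specification.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid tighteners + lip tightener
}

def match_emotion(active_aus):
    """Return the first emotion whose prototype AUs are all active."""
    active = set(active_aus)
    for emotion, prototype in EMOTION_PROTOTYPES.items():
        if prototype <= active:
            return emotion
    return "neutral"

print(match_emotion([6, 12, 25]))  # happiness
```

A real FACS-based recognizer detects AU intensities from image measurements first; this sketch only covers the final AU-to-emotion step.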

Literature Survey (Cont…) Kwang-Eun Ko & Kwee-Bo Sim (2009) [6] developed an advanced active appearance model for emotion recognition, used with a Bayesian network. Kwang-Eun Ko & Kwee-Bo Sim (2010) [7] combined the AAM (Active Appearance Model) and DBN (Dynamic Bayesian Network) for emotion recognition.

Literature Survey (Cont…) Yoshihiro Miyakoshi & Shohei Kato (2011) [8] developed a facial emotion detection system using a Bayesian network that handles partial occlusion of the face. Kumar et al. (2009) [9] developed a system for emotion recognition from facial expressions and its control using fuzzy logic.
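A full Bayesian network is beyond a slide, but the probabilistic idea, including skipping occluded features, can be hinted at with a toy naive Bayes classifier. The feature names and all numbers below are invented for illustration:

```python
import math

# Toy per-emotion training samples of (mouth_openness, brow_raise)
# feature pairs; every number here is hypothetical.
TRAIN = {
    "happiness": [(0.6, 0.2), (0.7, 0.3), (0.65, 0.25)],
    "surprise":  [(0.9, 0.8), (0.85, 0.9), (0.95, 0.85)],
}

def _gauss(x, mean, var):
    """Gaussian likelihood of x under N(mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def nb_classify(features):
    """Naive-Bayes pick of the most likely emotion; a None feature is
    treated as occluded and simply skipped (crude occlusion handling)."""
    best, best_p = None, -1.0
    for emotion, samples in TRAIN.items():
        n = len(samples)
        p = 1.0
        for i, x in enumerate(features):
            if x is None:          # occluded feature: ignore it
                continue
            col = [s[i] for s in samples]
            mean = sum(col) / n
            var = sum((c - mean) ** 2 for c in col) / n + 1e-6
            p *= _gauss(x, mean, var)
        if p > best_p:
            best, best_p = emotion, p
    return best

print(nb_classify((0.6, 0.25)))   # happiness
print(nb_classify((None, 0.85)))  # surprise: mouth occluded, brow decides
```

A Bayesian network additionally models dependencies between features, which the "naive" independence assumption here deliberately ignores.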

Literature Survey (Cont…) Jia-Jun Wong & Siu-Yeung Cho (2009) [10] developed a model for facial emotion recognition using the local experts organization (LEO) technique. B. Fasel & Juergen Luettin (2003) [11] published a survey addressing the techniques and approaches used for facial expression analysis.

ISSUES AND CHALLENGES In all the systems, tools and algorithms above, some issues remain that are the major factors responsible for inaccuracy in the performance and the desired result of the systems. Following are some major issues:  Pose  Illumination  Partial Occlusion  Facial Texture Deformation (caused by age or physical disability)

Issues and Challenges (Cont…) 1. POSE Fig 4.1: Pose variation [f2]

Issues and Challenges (Cont…) 2. ILLUMINATION Fig 4.2: Illumination variation [f3]
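A common preprocessing step against illumination variation is histogram equalization, which spreads pixel intensities across the full range. A minimal sketch on a flat list of 8-bit grayscale pixels (pure Python, for illustration only):

```python
def equalize(pixels, levels=256):
    """Histogram-equalize a flat list of integer grayscale pixels."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function over intensity levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:                  # constant image: nothing to spread
        return list(pixels)
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
            for p in pixels]

# A dark, low-contrast patch gets stretched across 0..255:
print(equalize([10, 10, 20, 30]))  # [0, 0, 128, 255]
```

Real systems apply this (or more robust variants such as adaptive equalization) to the detected face region before feature extraction.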

Issues and Challenges (Cont…) 3. PARTIAL OCCLUSION Fig 4.3: Partial Occlusion [f4]

Issues and Challenges (Cont…) 4. Facial Texture Deformation (caused by age or physical disability) Fig 4.4: Facial Texture Deformation [f5]

OBJECTIVE AND PROPOSED METHODOLOGY/FRAMEWORK Based on the previous works, we are preparing an automated system for emotion recognition from facial images, having three major modules:  Face detection  Facial feature extraction  Facial emotion recognition
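The three modules can be wired as a simple pipeline. Every function below is a hypothetical stub standing in for the real component, to show only the data flow:

```python
def detect_face(image):
    """Return the face region of the image (stub: whole image)."""
    return image

def extract_features(face):
    """Return a feature vector for the face (stub: raw pixel values)."""
    return list(face)

def recognize_emotion(features, classifier):
    """Map a feature vector to an emotion label via a trained classifier."""
    return classifier(features)

def pipeline(image, classifier):
    return recognize_emotion(extract_features(detect_face(image)), classifier)

# Usage with a trivial stand-in classifier:
label = pipeline([0.1, 0.9], lambda f: "happiness" if f[-1] > 0.5 else "sadness")
print(label)  # happiness
```

Keeping the modules behind simple function boundaries like this lets each one (detector, feature extractor, classifier) be swapped independently.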

Development & Implementation Framework The dataset is split into training images and test images. Each captured image passes through face detection, image segmentation and facial feature extraction; the extracted features are used for training by a classifier, which then classifies a test image into an emotion. Fig 5.2: Methodology of the system
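The "training by a classifier" and "classification" boxes can be illustrated with a minimal nearest-centroid classifier over feature vectors; the labels, features and numbers below are hypothetical:

```python
def train_centroids(training_data):
    """training_data: {label: [feature_vector, ...]} -> {label: centroid}."""
    centroids = {}
    for label, vectors in training_data.items():
        n = len(vectors)
        centroids[label] = [sum(col) / n for col in zip(*vectors)]
    return centroids

def predict(centroids, vector):
    """Return the label of the nearest centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(centroids[lbl], vector)))

# Training phase on (made-up) extracted feature vectors:
centroids = train_centroids({
    "happiness": [[0.9, 0.1], [0.8, 0.2]],
    "sadness":   [[0.1, 0.9], [0.2, 0.8]],
})
# Test phase on an unseen feature vector:
print(predict(centroids, [0.85, 0.15]))  # happiness
```

Any trainable classifier (SVM, Bayesian network, neural network) can replace this one without changing the rest of the framework.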

REFERENCES
[1]
[2] P. Ekman and W. V. Friesen, “Constants across cultures in the face and emotion”, Journal of Personality and Social Psychology, vol. 17, no. 2, 1971, pp. 124–129.
[3]
[4] P. Ekman and W. V. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, 1978.
[5] P. Ekman and E. L. Rosenberg, What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS), Oxford University Press, New York, 1997.
[6] Kwang-Eun Ko and Kwee-Bo Sim, “Development of Advanced Active Appearance Model for Facial Emotion Recognition”, IEEE International Symposium on Industrial Electronics (ISIE), 2009.
[7] Kwang-Eun Ko and Kwee-Bo Sim, “Development of a Facial Emotion Recognition Method based on combining AAM with DBN”, IEEE International Conference on Cyberworlds, 2010.

REFERENCES (Cont…)
[8] Y. Miyakoshi and S. Kato, “Facial Emotion Detection Considering Partial Occlusion of Face Using Bayesian Network”, IEEE Symposium on Computers & Informatics, 2011.
[9] U. Kumar and A. Chakraborty, “Emotion Recognition From Facial Expressions and Its Control Using Fuzzy Logic”, IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 39, 2009.
[10] Jia-Jun Wong and Siu-Yeung Cho, “A local experts organization model with application to face emotion recognition”, Expert Systems with Applications, vol. 36, 2009.
[11] B. Fasel and J. Luettin, “Automatic facial expression analysis: a survey”, Pattern Recognition, vol. 36, 2003.
[f1] Jia-Jun Wong and Siu-Yeung Cho, “A local experts organization model with application to face emotion recognition”, Expert Systems with Applications, vol. 36, 2009.
[f2]
[f3]
[f4]
[f5]

THANK YOU