 ASMARUL SHAZILA BINTI ADNAN 812538

 The word "emotion" comes from a Latin word meaning "to move out".  Human emotion can be recognized from facial expression, speech, electroencephalography (EEG), the autonomic nervous system (ANS) and body movement.  Emotion detection using body movement is still largely unexplored, possibly because body posture lacks a formal model such as the Facial Action Coding System (FACS) that exists for the face (Kleinsmith, 2011).

 A relationship exists between facial expression and body movement.  Most papers focus on emotion detection through acted scenarios, even though people naturally express their emotions through body movement.  There is a large accuracy gap between acted and non-acted scenarios in human emotion detection using body movement.

RQ1  What is relationship between facial expression and body movement? RO1  To determine the relationship between facial expression and body movement

RQ2  Why there is a big gap in accuracy between acted and non-acted scenarios in emotion detection using body movement? RO2  To identify the big gap in accuracy between acted and non-acted scenarios in emotion detection using body movement.

RQ3  How to implement non- acted body movement in emotion detection? RO3  To identify the way to implement non-acted body movement in emotion detection.

Playing Nintendo Wii games

Input Data  The subjects will be asked to play Nintendo Wii games for about 15 minutes while being recorded with a motion capture system and a camera.
Body Pre-processing  17 parts of the body can be used for human emotion detection: head, neck, arms, spine, shoulders, forearms, wrists, upper legs, knees and feet.
Feature Extraction  EyesWeb software.
Emotion Classification  Support Vector Machine.
Performance Evaluation  Recognition rate (see the sketch below).
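To make the classification and evaluation stages concrete, here is a minimal Python sketch using scikit-learn. The feature matrix, emotion labels and SVM settings are illustrative assumptions; the proposal only specifies that a Support Vector Machine classifies features extracted with EyesWeb and that performance is measured as recognition rate.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data standing in for EyesWeb output: one row per recorded
# segment, one column per body-movement feature (e.g. one per tracked body part).
X = rng.normal(size=(200, 17))
y = rng.choice(["happy", "angry", "sad", "neutral"], size=200)  # hypothetical labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Scale the features, then fit an RBF-kernel support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

# Performance evaluation: recognition rate = fraction of correctly
# classified test segments.
print("Recognition rate:", accuracy_score(y_test, clf.predict(X_test)))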

Example of photos used by Aviezer (2012).  Body language can differ from, or conflict with, the emotion in the mind, as reported in Aviezer (2012).

 Most researchers use acted scenarios to recognize human emotion, with only a few exceptions that study natural expressions.  Non-acted expressions are more complex, more detailed and less separable than acted expressions.  Non-acted expressions are therefore more challenging than acted scenarios (Kleinsmith, 2011).

 120 subjects were involved in this research, which used features such as body symmetry, jerk and head movement. Only 1 subject's emotion was incorrectly classified (Glowinski, 2011).
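As an aside on these features: jerk is the time derivative of acceleration, i.e. the third derivative of position. A minimal sketch of computing a mean jerk magnitude from a motion-capture trajectory follows; the sampling rate, the synthetic trajectory and the aggregation into a mean are illustrative assumptions, not Glowinski's exact feature definition.

import numpy as np

def mean_jerk_magnitude(positions: np.ndarray, fs: float) -> float:
    """positions: (n_frames, 3) x/y/z coordinates of one marker; fs: sampling rate in Hz."""
    dt = 1.0 / fs
    velocity = np.gradient(positions, dt, axis=0)      # 1st derivative of position
    acceleration = np.gradient(velocity, dt, axis=0)   # 2nd derivative
    jerk = np.gradient(acceleration, dt, axis=0)       # 3rd derivative
    return float(np.linalg.norm(jerk, axis=1).mean())

# Example: a synthetic 2-second head trajectory sampled at 100 Hz.
t = np.linspace(0.0, 2.0, 200)
head = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])
print(mean_jerk_magnitude(head, fs=100.0))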

 Subjects were directed to represent body postures through body extension and through arm and upper-body position. Using this approach, the accuracy rate was 96% (Piccardi, 2005).
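For illustration, posture descriptors of this kind can be computed directly from 3-D joint positions. The joint names and the two measures below are hypothetical stand-ins, not Piccardi's published feature set.

import numpy as np

def posture_features(joints: dict) -> dict:
    """joints maps a joint name to its (x, y, z) position for one frame."""
    # Body extension: how far both hands reach from the chest.
    extension = float(np.linalg.norm(joints["left_hand"] - joints["chest"]) +
                      np.linalg.norm(joints["right_hand"] - joints["chest"]))
    # Upper-body position: height of the neck above the pelvis
    # (drops when the torso slumps or leans).
    uprightness = float(joints["neck"][2] - joints["pelvis"][2])
    return {"extension": extension, "uprightness": uprightness}

# Example with random joint positions for a single frame.
rng = np.random.default_rng(1)
frame = {name: rng.random(3) for name in
         ["left_hand", "right_hand", "chest", "neck", "pelvis"]}
print(posture_features(frame))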

 Subjects' emotions were identified while they interacted with a computer-based tutor. The experiment was held in a seated environment and was used to identify emotions such as boredom, delight, confusion, frustration and flow. The accuracy of this result was only 40% (Graesser, 2009).

 This paper examined participants' emotional states while they played chess and communicated with a small robot. By combining body posture and movement, it achieved a classification rate of 82% (Castellano, 2009).

Acted Scenarios
 The subjects have already been trained in how to act out each emotion; they merely display the emotion according to their training.
 Not complex.
 Subjects deliberately show the emotion, but the body muscles may not actually express it, so the data can be misleading.
Non-Acted Scenarios
 The subjects have no training in how to show the emotion.
 Complex.
 Gives results that correspond more accurately to real-life situations.