1
ASMARUL SHAZILA BINTI ADNAN 812538
2
The word "emotion" comes from a Latin word meaning "to move out." Human emotion can be recognized from facial expressions, speech, electroencephalography (EEG), the autonomic nervous system (ANS), and body movement. Emotion detection using body movement is still largely unexplored, possibly because there is no formal model for body posture comparable to the Facial Action Coding System (FACS) for the face (Kleinsmith, 2011).
4
A relationship exists between facial expression and body movement. Most papers focus on emotion detection through acted scenarios, even though people express their emotions through body movement naturally. There is a large gap in accuracy between acted and non-acted scenarios in human emotion detection using body movement.
5
RQ1: What is the relationship between facial expression and body movement? RO1: To determine the relationship between facial expression and body movement.
6
RQ2: Why is there a large gap in accuracy between acted and non-acted scenarios in emotion detection using body movement? RO2: To identify the causes of the large accuracy gap between acted and non-acted scenarios in emotion detection using body movement.
7
RQ3: How can non-acted body movement be implemented in emotion detection? RO3: To identify a way to implement non-acted body movement in emotion detection.
8
Playing Nintendo Wii games
9
Input Data: Subjects will be asked to play Nintendo Wii games for about 15 minutes while being recorded with a motion capture system and a camera.
Body Pre-processing: 17 body parts can be used for human emotion detection: head, neck, arms, spine, shoulders, forearms, wrists, upper legs, knees, and feet.
Feature Extraction: Software: EyesWeb.
Emotion Classification: Support Vector Machine.
Performance Evaluation: Recognition rate.
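The classification and evaluation stages of the pipeline above can be sketched with a Support Vector Machine. This is a minimal illustration only: the feature matrix and emotion labels here are randomly generated placeholders, not actual EyesWeb output, and the label set is hypothetical.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical feature vectors: one row per recorded clip,
# one column per body-movement feature (e.g. a statistic per body part).
X = rng.normal(size=(200, 17))
# Hypothetical emotion labels: 0 = neutral, 1 = happy, 2 = frustrated.
y = rng.integers(0, 3, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf")  # Support Vector Machine classifier
clf.fit(X_train, y_train)

# Performance evaluation: recognition rate = fraction classified correctly.
rate = accuracy_score(y_test, clf.predict(X_test))
print(f"Recognition rate: {rate:.2f}")
```

With random features and labels the recognition rate hovers around chance; with real motion-capture features it would measure how well movement predicts emotion.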
10
Example of photos used by Aviezer (2012). Body language can differ from, or conflict with, the emotion a person actually feels, as reported in Aviezer (2012).
11
Most researchers use acted scenarios to recognize human emotion, with only a few exceptions using natural expressions. Non-acted expressions are more complex, more detailed, and less separable compared with acted expressions, which makes them more challenging (Kleinsmith, 2011).
12
This research involved 120 subjects and used movement features such as body symmetry, jerk, and head movement. Only one subject's emotion was incorrectly classified (Glowinski, 2011).
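Jerk, one of the movement features mentioned above, is the third time-derivative of position. A minimal sketch of extracting it from a tracked trajectory; the sampling rate and the sinusoidal trajectory are made-up stand-ins for motion-capture data:

```python
import numpy as np

fs = 100.0                       # assumed sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
# Hypothetical 1-D hand trajectory (metres) from motion capture.
pos = 0.1 * np.sin(2 * np.pi * 2 * t)

vel = np.gradient(pos, 1 / fs)   # velocity (first derivative)
acc = np.gradient(vel, 1 / fs)   # acceleration (second derivative)
jerk = np.gradient(acc, 1 / fs)  # jerk (third derivative)

# A common scalar summary: root-mean-square jerk over the clip.
rms_jerk = float(np.sqrt(np.mean(jerk ** 2)))
print(f"RMS jerk: {rms_jerk:.1f} m/s^3")
```

Abrupt, agitated movements produce large jerk values, while smooth movements produce small ones, which is why jerk is informative for emotion recognition.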
13
Subjects were directed to represent body postures in terms of body extension and arm and upper-body position. Using this approach, the accuracy rate was 96% (Piccardi, 2005).
14
Identified subjects' emotions while they were interacting with a computer-based tutor. The experiment was held in a seated environment and was used to identify emotions such as boredom, delight, confusion, frustration, and flow. The accuracy of this result is only 40% (Graesser, 2009).
15
This paper examines participants' emotional states while they play chess and communicate with a small robot. By combining body posture and movement, the classification rate reached 82% (G. Castellano, 2009).
16
Acted Scenarios: The subjects have already been trained in how to act out each emotion; they merely perform the emotion according to that training. Not complex. Subjects deliberately display the emotion, but their body muscles do not genuinely express it, so the data can be misleading. Non-Acted Scenarios: The subjects have no training in how to show emotions. Complex. Gives results that transfer more accurately to real-life situations.