Emotion Detection Ryan K Smith

What is Emotion Detection? Emotion detection is exactly what it sounds like: the ability to detect an emotion present on a face and determine which emotion that person is exhibiting.

Emotion Detection Disorders Social-emotional agnosia, also known as emotional agnosia or expressive agnosia, is the inability to perceive facial expressions, body language, and voice intonation. A person with this disorder is unable to non-verbally perceive others' emotions in social situations, limiting normal social interactions. The condition causes a functional blindness to subtle non-verbal social-emotional cues in voice, gesture, and facial expression. People with this form of agnosia have difficulty determining and identifying the motivational and emotional significance of external social events, and may appear emotionless or agnostic (uncertain or generally indecisive about a particular thing). Symptoms can vary depending on the area of the brain affected. Social-emotional agnosia often occurs in individuals with schizophrenia and autism.

Emotion Detection Disorders Alexithymia is a personality construct characterized by the subclinical inability to identify and describe emotions in the self. The core characteristics of alexithymia are marked dysfunction in emotional awareness, social attachment, and interpersonal relating. Furthermore, alexithymics have difficulty in distinguishing and appreciating the emotions of others, which is thought to lead to unempathic and ineffective emotional responding. Alexithymia is prevalent in approximately 10% of the general population and is known to be comorbid with a number of psychiatric conditions.

Original App Goals I originally set out to create an app that helps with both of these disorders. The original design had three parts. The first part was a game: the app would present you with a face and you would guess its emotion. It would tell you the answer and then ask you to try to show the same emotion via the camera. It would then give you a score based on how well you displayed that emotion.

Original App Goals The app would also be able to take in pictures and read the emotions in them to assist its user. Lastly, a free-form section would help a user see how changing one's face affects the emotion displayed (the OpenCV section).

Scope enforced by reality Modified scope: I decided to focus primarily on the free-form section of the app (the OpenCV section).

Dataset The Karolinska Institutet Emotion Lab in Solna, Sweden, has compiled a set of 4900 face images. The images are separated into five different emotions and split by male and female. The data comes as pictures, not as pixel arrays compiled into an xlsx or csv file. I chose three emotions for this project: happy/smiling, sad/frowning, and angry/scowling.

Problems with the Data Since the data is all in photo form, the images have to be individually read into MATLAB before they can be combined. To save time for this project, and to make the app more accurate, I decided to have the app ask the user their gender and work with just one gender. This reduced the number of pictures I needed to analyse and import into MATLAB from 2940 to 1470 JPG files.

Gender Selection: Side note: I'm also working on custom buttons to replace the words "male" and "female" with the gender symbols used on restroom doors. Since only the male data has been processed, pressing "female" currently does nothing.

MATLAB Problem I wanted to import all of these photos at once, but I kept running into problems in MATLAB, mostly user error from dealing with such large numbers of photos. I decided to solve it with a manual divide-and-conquer approach: each emotion had 490 photos, which I divided into 10 groups of 49, combined each subgroup, and then combined the subgroups into one master array per emotion. (A scripted version of this kind of import is sketched below.)
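The slides don't include the import script itself; the following is a minimal MATLAB sketch of this sort of batch import, assuming a placeholder folder name 'happy' and the 15x15 downsampling shown on a later slide:

% Batch-import one emotion's JPEGs and stack them into a single array.
% The folder name 'happy' is a placeholder, not the author's actual path.
files = dir(fullfile('happy', '*.jpg'));          % list the 490 photos
n = numel(files);
faces = zeros(15, 15, n);                         % stack of downsampled faces
for k = 1:n
    img = imread(fullfile('happy', files(k).name));
    if size(img, 3) == 3
        img = rgb2gray(img);                      % the source photos are colour
    end
    faces(:, :, k) = im2double(imresize(img, [15 15]));
end
meanHappy = mean(faces, 3);                       % one 15x15 template per emotion

The same loop, repeated for the sad and angry folders, would give one mean template per emotion.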

Happy/Smiling

Sad/Frowning

Angry/Snarl

Data Quality Negatives: The photos were compiled in the 1990s in Sweden, so it is not exactly the most culturally diverse sample set: pretty much all white individuals without facial hair. Positives: All individuals are set against the same background and wear the same shirt, which serve as controls, and all are facing directly at the camera.

15x15 pixels: Happy/Smiling, Sad/Frowning, Angry/Snarling
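The comparison step is not shown on the slides; here is a minimal nearest-template sketch, under the assumption that the three 15x15 mean images above (meanHappy, meanSad, meanAngry, built as in the import sketch) are used as templates, with a placeholder file name:

% Classify a new face by its distance to the three 15x15 mean templates.
% meanHappy, meanSad, meanAngry are assumed from the import sketch above;
% 'test_face.jpg' is a placeholder file name.
test  = im2double(imresize(rgb2gray(imread('test_face.jpg')), [15 15]));
dists = [norm(test - meanHappy, 'fro'), ...
         norm(test - meanSad,   'fro'), ...
         norm(test - meanAngry, 'fro')];
labels = {'Happy', 'Sad', 'Angry'};
[~, idx] = min(dists);
fprintf('Predicted emotion: %s\n', labels{idx});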

Unfinished My app is currently unfinished. I'm having issues with the OpenCV side detecting the emotions. I was able to test my data sample, though, by applying the detection to still images. Ironically enough, this is similar to app function number 2: helping individuals determine the emotion in a photo. (A sketch of the detection step is below.)
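For illustration only (the app's actual OpenCV code isn't shown), this is roughly how the face-detection step could look in MATLAB, with the Computer Vision Toolbox's Viola-Jones detector standing in for OpenCV and a placeholder file name:

% Detect a face and crop it to the 15x15 template size.
detector = vision.CascadeObjectDetector();        % frontal-face cascade by default
img = imread('snapshot.jpg');                     % placeholder file name
bboxes = step(detector, img);                     % one [x y width height] row per face
if ~isempty(bboxes)
    face = imcrop(img, bboxes(1, :));             % crop the first detected face
    face = im2double(imresize(rgb2gray(face), [15 15]));  % match the template size
end

The cropped 15x15 face could then be fed to the nearest-template comparison sketched earlier.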

Future goals: I want to have all three sections up and running. Maybe even have the app keep high scores on the phone for the game, or keep a best score for the day and record a person's progress over time. Also, including multiple races will probably help with accuracy for nonwhite people. Facial hair for men will also be a must.

Conclusion: I would like the app to achieve an acceptable level of accuracy. Right now, when a photo is compared against the sample data, it is right 68.7% of the time. I want it to be right almost as often as humans are.

Which brings me to my final part of the presentation… THE QUIZ!!!

Person 1

Person 1 Answer: Happy Computer Answer: Happy

Person 2

Person 2 Answer: Happy Computer Answer: Happy

Person 3

Person 3 Answer: Happy Computer Answer: Happy

Person 4

Person 4 Answer: Angry Computer Answer: Angry

Person 5

Person 5 Answer: Angry Computer Answer: Happy

Person 6

Person 6 Answer: Sad Computer Answer: Sad

Person 7

Person 7 Answer: Sad Computer Answer: Sad

Person 8

Person 8 Answer: Sad Computer Answer: Angry

Person 9

Person 9 Answer: Angry Computer Answer: Angry

Person 10

Person 10 Answer: Happy Computer Answer: Happy

Person 11

Person 11 Answer: Sad Computer Answer: Angry

Person 12

Person 12 Answer: Angry Computer Answer: Happy

Person 13

Person 13 Answer: Happy Computer Answer: Happy

Person 14

Person 14 Answer: Angry Computer Answer: Happy

Person 15

Person 15 Answer: Happy Computer Answer: Sad

Person 16

Person 16 Answer: Sad Computer Answer: Sad

Person 17

Person 17 Answer: Happy Computer Answer: Happy

Person 18

Person 18 Answer: Happy Computer Answer: Angry

Person 19

Person 19 Answer: Angry Computer Answer: Sad

Person 20

Person 20 Answer: Happy Computer Answer: Happy

Person 21

Person 21 Answer: Angry Computer Answer: Sad

Person 22

Person 22 Answer: Happy Computer Answer: Angry

Person 23

Person 23 Answer: Happy Computer Answer: Happy

Person 24

Person 24 Answer: Sad Computer Answer: Sad

Person 25

Person 25 Answer: Sad Computer Answer: Sad

Person 26

Person 26 Answer: Constipated?... I mean angry Computer Answer: Angry

Person 27

Person 27 Answer: Sad Computer Answer: Sad

Results The prediction model guessed 17/27 correctly, which is about 63% accuracy; not too bad considering some of the photos. Note that this is lower than the 68.7% it achieved against the sample data used for training. How'd you do?
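For anyone who wants to check the tally, here is a short MATLAB snippet (my own encoding of the quiz slides above, not part of the app) that recomputes the score:

% Quiz slides encoded as label indices: 1 = Happy, 2 = Sad, 3 = Angry (persons 1-27).
human    = [1 1 1 3 3 2 2 2 3 1 2 3 1 3 1 2 1 1 3 1 3 1 1 2 2 3 2];
computer = [1 1 1 3 1 2 2 3 3 1 3 1 1 1 2 2 1 3 2 1 2 3 1 2 2 3 2];
fprintf('Correct: %d/%d (%.1f%%)\n', ...
        sum(human == computer), numel(human), 100 * mean(human == computer));
% Prints: Correct: 17/27 (63.0%)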