Emotion Detection Ryan K Smith.


1 Emotion Detection Ryan K Smith

2 What is Emotion Detection? Emotion detection is exactly what it sounds like: the ability to detect an emotion present on a face and determine which emotion that person is exhibiting.

3 Emotion Detection Disorders
Social-emotional agnosia, also known as emotional agnosia or expressive agnosia, is the inability to perceive facial expressions, body language, and voice intonation. A person with this disorder is unable to non-verbally perceive others' emotions in social situations, limiting normal social interactions. The condition causes a functional blindness to subtle non-verbal social-emotional cues in voice, gesture, and facial expression. People with this form of agnosia have difficulty determining and identifying the motivational and emotional significance of external social events, and may appear emotionless or agnostic (that is, uncertain or generally indecisive about a particular thing). Symptoms of this agnosia can vary depending on the area of the brain affected. Social-emotional agnosia often occurs in individuals with schizophrenia and autism.

4 Emotion Detection Disorders
Alexithymia is a personality construct characterized by the subclinical inability to identify and describe emotions in the self. The core characteristics of alexithymia are marked dysfunction in emotional awareness, social attachment, and interpersonal relating. Furthermore, alexithymics have difficulty in distinguishing and appreciating the emotions of others, which is thought to lead to unempathic and ineffective emotional responding. Alexithymia is prevalent in approximately 10% of the general population and is known to be comorbid with a number of psychiatric conditions.

5 Original App Goals I originally set out to create an app that helps with both of these disorders. The original design had 3 parts. The first part was a game: the app would present you with a face, and you would guess its emotion. It would tell you the answer and then have you try to show the same emotion via the camera. It would then give you a score based on how well you displayed that emotion.

6 Original App Goals The app would also be able to take in pictures and read the emotions to assist its user. Lastly, a free-form section to help a user see how changing one's face affects the emotion displayed (the OpenCV section).

7 Scope enforced by reality
Modified scope: I decided to focus primarily on the free-form section of the app (the OpenCV section).

8 Dataset The Karolinska Institutet Emotion Lab, out of Solna, Sweden, has compiled a set of 4,900 face images. The images are separated into 5 different emotions and by male and female. The data comes as picture files, not as pixel arrays compiled into an xlsx or csv. I chose 3 emotions for this project: happy/smiling, sad/frowning, and anger/scowling.

9 Problems with the Data Since the data is all in photo form, the photos have to be individually imported into MATLAB for combination. To save time for this project, and to make the app more accurate, I decided to have the app ask the user their gender and work on just one gender. This reduced the number of pictures I needed to analyze and import into MATLAB from 2940 to 1470 jpg files.
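The gender filter described above can be sketched in Python (the project itself used MATLAB). The directory layout and file names below are hypothetical, not the dataset's real naming scheme; the counts match the slide's 2940-to-1470 reduction:

```python
from pathlib import Path

# Hypothetical layout: dataset/<emotion>/<gender-letter>_<id>.jpg
# (the real dataset files use their own naming scheme)
EMOTIONS = ["happy", "sad", "angry"]

def select_gender(paths, gender):
    """Keep only one gender's photos to halve the import workload."""
    prefix = gender[0].lower()  # 'm' or 'f'
    return [p for p in paths if Path(p).name.startswith(prefix)]

# Simulate the 3-emotion, 2-gender set: 490 photos per emotion per gender.
all_paths = [f"dataset/{emo}/{g}_{i:03d}.jpg"
             for emo in EMOTIONS for g in "mf" for i in range(490)]
male_only = select_gender(all_paths, "male")
print(len(all_paths), len(male_only))  # 2940 1470
```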

10 Gender Selection: Side note: I'm also working on custom buttons to replace the words Male and Female with the gender symbols used on restroom doors. Since only male data has been processed, if you press Female nothing happens.

11 MATLAB problem I wanted to enter all of these at once and import them together, but I kept running into errors in MATLAB, mostly user error from dealing with such large counts of photos. So I decided to solve it with a manual divide and conquer. Each emotion had 490 photos. I divided that into 10 groups of 49 and combined each subgroup. Then I combined the subgroups into one master photo/array per emotion.
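The divide-and-conquer averaging could look like this in Python (a sketch of the manual MATLAB workflow; the list-of-rows image format and the `average_images` helper are illustrative assumptions, not the project's actual code):

```python
def average_images(images):
    """Element-wise mean of equally sized grayscale images (lists of pixel rows)."""
    n = len(images)
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[r][c] for img in images) / n for c in range(w)]
            for r in range(h)]

def divide_and_conquer(images, group_size=49):
    """Average in groups of 49 first, then average the 10 group means.
    With equal group sizes this equals the overall mean, but each step
    only touches a small batch of images at a time."""
    groups = [images[i:i + group_size] for i in range(0, len(images), group_size)]
    return average_images([average_images(g) for g in groups])
```

Because 490 divides evenly into 10 groups of 49, the mean of the group means is identical to averaging all 490 photos at once.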

12 Happy/Smiling

13 Sad/Frowning

14 Angry/Snarling

15 Data Quality Negatives: The photos were compiled in the 90s in Sweden, so it's not exactly the most culturally diverse sample set: pretty much all white individuals without facial hair. Positives: All individuals are set against the same background and are wearing the same shirt, to serve as controls. Also, all are facing directly at the camera.

16 15×15 pixels: Happy/Smiling, Sad/Frowning, Angry/Snarling
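Reducing each master face to a 15×15 pixel template can be sketched as block averaging (Python instead of the project's MATLAB; the square list-of-rows image format is an assumption):

```python
def downsample(image, size=15):
    """Block-average a square grayscale image (list of pixel rows) down to a
    size x size template. Assumes the side length is a multiple of `size`."""
    block = len(image) // size
    return [[sum(image[r * block + i][c * block + j]
                 for i in range(block) for j in range(block)) / block ** 2
             for c in range(size)]
            for r in range(size)]
```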

17 Unfinished My app is currently unfinished. I'm having issues with the OpenCV side detecting the emotions. I did have the ability to test my data sample, though, by applying the detection to images. Ironically enough, this is similar to app function number 2: helping an individual determine the emotion in a photo.

18 Future goals: I want to have all 3 sections up and running. Maybe even have the app keep high scores on the phone for the game, or keep a best score for the day and record a person's progress over time. Also, including multiple races will probably help with accuracy for nonwhite people. Facial hair for men will also be a must.

19 Conclusion: I would like the app to achieve an acceptable level of accuracy. Right now, when a photo is compared to the sample data, it is right 68.7% of the time. I want it to be right almost as often as humans.
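The slides don't show how a photo is "compared to the sample data"; one plausible reading is a nearest-template match against the per-emotion master arrays, sketched here in Python (the project used MATLAB/OpenCV, and the sum-of-squared-differences metric is my assumption):

```python
def classify(face, templates):
    """Compare a 15x15 face (list of pixel rows) to each emotion's master
    template and return the emotion with the smallest sum of squared
    pixel differences."""
    def ssd(a, b):
        return sum((pa - pb) ** 2
                   for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))
    return min(templates, key=lambda emo: ssd(face, templates[emo]))
```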

20 Which brings me to my final part of the presentation…
THE QUIZ!!!

21 Person 1

22 Person 1 Answer: Happy Computer Answer: Happy

23 Person 2

24 Person 2 Answer: Happy Computer Answer: Happy

25 Person 3

26 Person 3 Answer: Happy Computer Answer: Happy

27 Person 4

28 Person 4 Answer: Angry Computer Answer: Angry

29 Person 5

30 Person 5 Answer: Angry Computer Answer: Happy

31 Person 6

32 Person 6 Answer: Sad Computer Answer: Sad

33 Person 7

34 Person 7 Answer: Sad Computer Answer: Sad

35 Person 8

36 Person 8 Answer: Sad Computer Answer: Angry

37 Person 9

38 Person 9 Answer: Angry Computer Answer: Angry

39 Person 10

40 Person 10 Answer: Happy Computer Answer: Happy

41 Person 11

42 Person 11 Answer: Sad Computer Answer: Angry

43 Person 12

44 Person 12 Answer: Angry Computer Answer: Happy

45 Person 13

46 Person 13 Answer: Happy Computer Answer: Happy

47 Person 14

48 Person 14 Answer: Angry Computer Answer: Happy

49 Person 15

50 Person 15 Answer: Happy Computer Answer: Sad

51 Person 16

52 Person 16 Answer: Sad Computer Answer: Sad

53 Person 17

54 Person 17 Answer: Happy Computer Answer: Happy

55 Person 18

56 Person 18 Answer: Happy Computer Answer: Angry

57 Person 19

58 Person 19 Answer: Angry Computer Answer: Sad

59 Person 20

60 Person 20 Answer: Happy Computer Answer: Happy

61 Person 21

62 Person 21 Answer: Angry Computer Answer: Sad

63 Person 22

64 Person 22 Answer: Happy Computer Answer: Angry

65 Person 23

66 Person 23 Answer: Happy Computer Answer: Happy

67 Person 24

68 Person 24 Answer: Sad Computer Answer: Sad

69 Person 25

70 Person 25 Answer: Sad Computer Answer: Sad

71 Person 26

72 Person 26 Answer: Constipated?... I mean angry Computer Answer: Angry

73 Person 27

74 Person 27 Answer: Sad Computer Answer: Sad

75 Results The prediction model guessed 17/27 correctly. That's 63.0% accuracy, not too bad considering some of the photos. Note this is lower than the 68.7% it scored when comparing against the sample data used for training. How'd you do?
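Tallying the quiz slides above confirms the score; here's the count as a quick Python check:

```python
# (presenter's answer, computer's answer) for Persons 1-27; H/S/A = happy/sad/angry
quiz = [
    ("H", "H"), ("H", "H"), ("H", "H"), ("A", "A"), ("A", "H"), ("S", "S"),
    ("S", "S"), ("S", "A"), ("A", "A"), ("H", "H"), ("S", "A"), ("A", "H"),
    ("H", "H"), ("A", "H"), ("H", "S"), ("S", "S"), ("H", "H"), ("H", "A"),
    ("A", "S"), ("H", "H"), ("A", "S"), ("H", "A"), ("H", "H"), ("S", "S"),
    ("S", "S"), ("A", "A"), ("S", "S"),
]
correct = sum(h == c for h, c in quiz)
print(f"{correct}/{len(quiz)} = {correct / len(quiz):.1%}")  # 17/27 = 63.0%
```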

