Final Project Presentation | CIS3203

Presentation transcript:

Final Project Presentation | CIS3203
Humanoid Robot - Emotion Reading
Hello! My topic is the humanoid robot, which is the most… part of AI. Today I will talk especially about the emotion-reading humanoid robot.
Eunji Ha

CONTENTS
01 Introduction: Emotion Reading Robot?
02 Principle: How?
03 Evaluation: Think more?
This is the table of contents.

01 Introduction: Emotion Reading Robot?
First, let me show a video. As you could see in the video, Pepper is the most recently released emotion-reading robot. I will talk about the techniques that are used to recognize people's emotions, then cover more specific information on how an emotion-recognizing robot can be realized, and finish with an evaluation.

02 Principle: How to read human emotions?
1. Facial expression (visual info)-based
2. Speech (voice info)-based
3. User's music or text mood-based
4. Neural network-based (BCI)

02 Principle: How to read human emotions?
1. Facial expression (visual info)-based

02 Principle: How to read human emotions?
1. Facial expression (visual info)-based
"Emotion Detection & Sentiment Analysis Glassware"
Emotient's technology works by detecting subtle pattern changes in a person's face. The software, which can be used to measure emotion in individuals and crowds, can tell whether a person is feeling positive or negative overall and zero in on more specific emotions, such as joy, sadness, surprise, anger, fear, disgust, and contempt.
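As a rough illustration of this idea (my own sketch, not Emotient's actual system), here is a minimal face-based emotion classifier, assuming OpenCV for face detection and a hypothetical pretrained Keras model file "emotion_cnn.h5" trained on 48x48 grayscale faces with the seven categories listed above:

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["joy", "sadness", "surprise", "anger", "fear", "disgust", "contempt"]

# Haar cascade shipped with OpenCV; the model file is a hypothetical stand-in.
face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = load_model("emotion_cnn.h5")

def detect_emotions(frame_bgr):
    """Return a (label, confidence) pair for every face found in a BGR frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_finder.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y+h, x:x+w], (48, 48)) / 255.0
        probs = model.predict(face[None, :, :, None], verbose=0)[0]
        results.append((EMOTIONS[int(np.argmax(probs))], float(probs.max())))
    return results
```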

02 Principle: How to read human emotions?
2. Speech (voice info)-based
Pipeline: Voice Signal → Voice Detection → Voice Feature Extraction (preprocessing) → Voice Recognition → Emotion Recognition (data recognition)
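To make the preprocessing stage concrete, here is a hedged sketch of voice detection plus feature extraction, assuming the librosa library is available; MFCCs stand in for whatever features the original system used, since the slide does not name them:

```python
import librosa
import numpy as np

def extract_features(wav_path):
    """Trim silence, then summarize the voiced audio as a 13-dim MFCC vector."""
    signal, sr = librosa.load(wav_path, sr=16000)
    # Energy-based voice detection: keep only the non-silent intervals.
    intervals = librosa.effects.split(signal, top_db=30)
    voiced = np.concatenate([signal[start:end] for start, end in intervals])
    mfcc = librosa.feature.mfcc(y=voiced, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)  # one fixed-length vector per utterance
```

A vector like this would then feed the recognition stage, e.g. some downstream classifier trained on labeled emotional speech.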

02 Principle: How to read human emotions?
2. Speech (voice info)-based
Software that listens to your voice to assess your mood gives call centre agents a dashboard that shows how the conversation is going.
1. Vocal effort provides a means of discriminating soft, quiet voices from loud, tense ones. It focuses on the frequency components of speech that are independent of volume or energy. In the figure below, distributions of vocal effort are plotted for agents in red and customers in green. During the Comcast call (top panel), the analysis indicates significantly higher vocal effort and an overall tenser voice for the agent compared to the customer. For one of our typical "good" calls (bottom panel), the agent and customer are highly similar in terms of their vocal effort. Also, both speakers in the "good" conversation display a vocal effort profile that matches the customer in the Comcast call, who was generally perceived as remaining quite calm during the conversation.
2. Turn-taking: it is especially important to measure when we begin to speak relative to the person we are speaking to. Allowing too large a gap of silence after our speaking partner stops talking can make us seem disinterested; however, consistently leaving too short a gap, or overlapping the end of our partner's speech by too much, can be perceived as rude. A small gap-measurement sketch follows after this list.
3. This behavior continues throughout the conversation. Had the agent noticed and modified his behavior at any of these moments, there might have been a better outcome to the call.
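Here is a minimal sketch of that turn-taking measurement, under the assumption that the call has already been diarized into (speaker, start, end) turns in seconds; negative gaps are overlaps:

```python
def turn_gaps(turns):
    """Gap (or overlap, if negative) in seconds at each change of speaker."""
    gaps = []
    for prev, cur in zip(turns, turns[1:]):
        if prev[0] != cur[0]:              # speaker changed
            gaps.append(cur[1] - prev[2])  # next start minus previous end
    return gaps

call = [("agent", 0.0, 4.2), ("customer", 4.9, 9.1), ("agent", 8.8, 12.0)]
print(turn_gaps(call))  # [0.7, -0.3]: one polite pause, one rude overlap
```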

02 Principle: How to read human emotions?
3. User's music or text mood-based (tunes, lyrics)
Music is oftentimes referred to as a "language of emotion", and the texts we write also carry our emotional expressions. The computer (robot) analyzes this information to infer our feelings.

02 Principle: How to read human emotions?
3. User's music or text mood-based
Pipeline: Text → Tokenization → Identify Emotion Words → Analysis of Intensity → Negation Check → Emotion
1. Keyword spotting method: words are classified into emotion categories, checking whether negation is involved. A minimal sketch follows after this list.
2. Learning-based methods: unlike keyword-based detection methods, learning-based methods try to detect emotions with a previously trained classifier, applying machine-learning techniques such as support vector machines [8] and conditional random fields [9] to determine which emotion category the input text should belong to.
3. Hybrid methods: the most significant hybrid system so far is the work of Wu, Chuang and Lin [11], which uses a rule-based approach to extract semantics related to specific emotions and a Chinese lexicon ontology to extract attributes. These semantics and attributes are associated with emotions in the form of emotion association rules. These rules, replacing the original emotion keywords, then serve as the training features of a learning module based on separable mixture models. This method outperforms previous approaches, but the categories of emotions are still limited.
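A minimal sketch of the keyword-spotting pipeline above (tokenize, match emotion words, flag matches that fall in the scope of a negation); the lexicon and negator list here are illustrative, not from the original presentation:

```python
EMOTION_WORDS = {"happy": "joy", "glad": "joy", "sad": "sadness",
                 "angry": "anger", "afraid": "fear"}
NEGATORS = {"not", "never", "no"}

def spot_emotions(text):
    """Return (emotion, negated?) pairs found in the text."""
    tokens = text.lower().split()                   # tokenization
    found = []
    for i, tok in enumerate(tokens):
        if tok in EMOTION_WORDS:                    # identify emotion words
            negated = any(t in NEGATORS
                          for t in tokens[max(0, i - 2):i])  # negation check
            found.append((EMOTION_WORDS[tok], negated))
    return found

print(spot_emotions("i am not happy today"))  # [('joy', True)]
```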

02 Principle: How to read human emotions?
4. Neural network-based (BCI)
Working of a BCI: the brain signal is recorded from the user with electrodes and amplified; the amplified signal is fed to the BCI controller, where signal processing is done, i.e. feature extraction and feature translation. After signal processing, the device command and operating protocol come into action to drive the device, a wheelchair in this case. Device state feedback is also provided for further adjustment of the movement.
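To illustrate the feature-extraction step, here is a hedged sketch that computes EEG band power with Welch's method, assuming SciPy is available; the sampling rate and frequency bands are illustrative assumptions, not values from the presentation:

```python
import numpy as np
from scipy.signal import welch

def band_powers(eeg, fs=256):
    """Alpha and beta band power of a single-channel EEG segment."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)   # power spectral density
    def power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return float(np.trapz(psd[mask], freqs[mask]))
    return {"alpha": power(8, 13), "beta": power(13, 30)}

print(band_powers(np.random.randn(4 * 256)))  # features for a 4-second window
```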

02 Principle: More techniques? Cloud System
Now we know how the robot gets human emotions. These techniques are collectively called the 'emotion engine', which lets Pepper communicate with people not only verbally but also emotionally. Each robot can learn from its interactions with humans, and its experiences are uploaded to an Internet cloud database to be shared with other Peppers. This means Peppers can evolve as a collective intelligence -> meaning they are able to learn!
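As a toy sketch of this collective-learning idea (my own illustration, not SoftBank's actual architecture): each robot uploads the outcome of an interaction to a shared store, and any robot can query the merged experience. A Counter stands in for the real cloud database:

```python
from collections import Counter

CLOUD = Counter()                      # stand-in for the shared cloud database

def upload(emotion, reaction):
    """One robot shares a successful (emotion, reaction) experience."""
    CLOUD[(emotion, reaction)] += 1

def best_reaction(emotion):
    """Any robot asks the cloud what has worked best for this emotion."""
    options = {r: n for (e, r), n in CLOUD.items() if e == emotion}
    return max(options, key=options.get) if options else None

upload("sad", "comfort")     # experience from one Pepper
upload("sad", "comfort")     # experience from another Pepper
upload("sad", "joke")        # experience from a third Pepper
print(best_reaction("sad"))  # comfort, learned collectively
```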

03 Evaluation: Can and cannot? ♬ I feel HAPPY!
So far, the four techniques above, by which a robot recognizes human emotions, are being developed all over the world. However, because this is still a growing area and costs are limited, it is hard to combine all of the methods. The enormous amounts of data obtained in these various ways would help robots read more delicate human emotions. Also, although these techniques are evolving quickly, a robot's actions and speech are expressions that are already predefined and recorded. Making robots express themselves with their own intelligence and emotions is still a challenging problem.

03 Evaluation: Future task? Friend? or Foe?
Also!!! A robot's EMOTION RECOGNITION can give physical benefits and emotional communication, but it may also threaten humans' future lives. I will finish my presentation with a video that will make us think about this issue.

THANK YOU 915171941 Eunji Ha