Functional Contributions of Emotion to Artificial Intelligence
Bob Marinier
Advisor: John Laird
Introduction

- Folk psychology considers emotions a distraction from logical thought.
- People tend to think that emotion is unknowable and indefinable.
- Psychological work in the last several decades has demonstrated that emotion plays a critical role in effective functioning and learning.
Introduction: Research Goals

Goals:
- Bring the functionality of emotion to AI
- Create a precise computational definition of emotion

Approach:
- Integrate emotion with a complete agent framework
- Computationally distinguish emotion, mood, and feeling
- Weight feeling's importance by computing its intensity
- Use feeling as an intrinsic reward signal to drive reinforcement learning
Appraisal Theories of Emotion

- A situation is evaluated along a number of appraisal dimensions, many of which relate the situation to current goals: novelty, goal relevance, goal conduciveness, expectedness, causal agency, etc.
- The result of the appraisals determines the emotion.
- The emotion is combined with mood, an "average" over recent emotions, to form a feeling, which is actually perceived with some intensity.
- The feeling can then be coped with (via internal or external actions).

Flow: Situation (with respect to Goals) → Appraisal → Emotion, Mood, Feeling → Coping
Appraisals to Emotions (Scherer 2001)

| Appraisal dimension          | Joy         | Fear               | Anger       |
|------------------------------|-------------|--------------------|-------------|
| Suddenness                   | High/medium | High               |             |
| Unpredictability             |             | High               |             |
| Intrinsic pleasantness       |             | Low                |             |
| Goal/need relevance          | High        | High               | High        |
| Cause: agent                 |             | Other/nature       | Other       |
| Cause: motive                |             | Chance/intentional | Intentional |
| Outcome probability          | Very high   | High               | Very high   |
| Discrepancy from expectation |             |                    | High        |
| Conduciveness                | Very high   | Low                | Low         |
| Control                      |             |                    | High        |
| Power                        |             | Very low           | High        |

Why these dimensions? What is their functional purpose?
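The table above maps appraisal profiles to modal emotions. One way to sketch that mapping computationally is a nearest-profile lookup: each emotion is a partial profile over the appraisal dimensions, and a situation is labeled with the closest one, ignoring dimensions an emotion leaves open. The numeric profile values and the `label` helper below are hypothetical illustrations, not Scherer's published numbers.

```python
# Hypothetical sketch: label an appraisal frame by its distance to modal
# emotion profiles (Scherer-style). Dimensions an emotion leaves open are
# simply absent from its profile and ignored. Values are illustrative.
PROFILES = {
    "joy":   {"suddenness": 0.6, "outcome_probability": 1.0, "conduciveness": 1.0},
    "fear":  {"suddenness": 0.9, "unpredictability": 0.9, "intrinsic_pleasantness": -0.8,
              "outcome_probability": 0.7, "conduciveness": -0.7, "power": -0.9},
    "anger": {"outcome_probability": 1.0, "discrepancy": 0.8, "conduciveness": -0.7,
              "control": 0.8, "power": 0.8},
}

def label(appraisals: dict) -> str:
    """Return the emotion whose profile is closest to the appraisal frame."""
    def dist(profile):
        # Mean squared distance over the dimensions both sides specify.
        keys = [k for k in profile if k in appraisals]
        return sum((appraisals[k] - profile[k]) ** 2 for k in keys) / max(len(keys), 1)
    return min(PROFILES, key=lambda e: dist(PROFILES[e]))

fearful = {"suddenness": 0.9, "unpredictability": 0.8, "intrinsic_pleasantness": -0.9,
           "outcome_probability": 0.7, "conduciveness": -0.6, "power": -0.8}
print(label(fearful))  # → fear
```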
Functions of Emotion

- Situation summary: appraisals and emotion provide an abstract interpretation of the situation
- Decoupled stimulus/response: the agent can react to the interpretation instead of the raw stimulus
- Attention: some appraisals help prioritize processing
- Historical context: mood provides a context for current interpretations
- Learning: feeling may provide an intrinsic reward signal
- Also: memory, decision making, action preparation, communication
Outline

- Integrate emotion with a complete agent framework
- Computationally distinguish emotion, mood, and feeling
- Weight feeling's importance by computing its intensity
- Use feeling as an intrinsic reward signal to drive reinforcement learning
- Discussion & conclusion
Newell's Abstract Functional Operations (Newell 1990)

Allen Newell defined a set of computational Abstract Functional Operations that are necessary and sufficient for immediate behavior in humans and complete agents:

| Operation  | Function                                                                  |
|------------|---------------------------------------------------------------------------|
| Perceive   | Obtain raw perception                                                     |
| Encode     | Create domain-independent representation                                  |
| Attend     | Choose stimulus to process                                                |
| Comprehend | Generate structures that relate stimulus to tasks and can inform behavior |
| Task       | Perform task maintenance                                                  |
| Intend     | Choose an action, create prediction                                       |
| Decode     | Decompose action into motor commands                                      |
| Motor      | Execute motor commands                                                    |
Newell's Abstract Functional Operations (Newell 1990)

…but how these operations actually work was left unclear:

| Operation  | Open question                               |
|------------|---------------------------------------------|
| Perceive   | What information is generated?              |
| Encode     | What information is generated?              |
| Attend     | What information is required?               |
| Comprehend | What information is required and generated? |
| Task       | What information is required?               |
| Intend     | What information is required?               |
NAFO and Appraisal (Marinier & Laird 2006)

| Appraisal                    | Generated by | Required by              |
|------------------------------|--------------|--------------------------|
| Suddenness                   | Perceive     | Attend                   |
| Unpredictability             | Encode       | Attend                   |
| Intrinsic pleasantness       | Encode       | Attend                   |
| Goal relevance               | Encode       | Attend                   |
| Causal agent                 | Comprehend   | Comprehend, Task, Intend |
| Causal motive                | Comprehend   | Comprehend, Task, Intend |
| Outcome probability          | Comprehend   | Comprehend, Task, Intend |
| Discrepancy from expectation | Comprehend   | Comprehend, Task, Intend |
| Goal/need conduciveness      | Comprehend   | Comprehend, Task, Intend |
| Control                      | Comprehend   | Comprehend, Task, Intend |
| Power                        | Comprehend   | Comprehend, Task, Intend |
Outline

- Integrate emotion with a complete agent framework
- Computationally distinguish emotion, mood, and feeling
- Weight feeling's importance by computing its intensity
- Use feeling as an intrinsic reward signal to drive reinforcement learning
- Discussion & conclusion
Extending Soar with Emotion (Marinier & Laird 2007)

- Soar is a cognitive architecture.
- A cognitive architecture is a set of task-independent mechanisms that interact to give rise to behavior.
- Cognitive architectures are general agent frameworks.

[Architecture diagram: Body with Perception and Action; Short-Term Memory holding the situation and goals; Decision Procedure; symbolic long-term memories (Procedural, Semantic, Episodic) with Chunking, Reinforcement Learning, Semantic Learning, and Episodic Learning; Visual Imagery; plus a new Feeling Generation module.]
Extending Soar with Emotion (Marinier & Laird 2007)

[Architecture diagram: appraisals generated from the agent's knowledge about the situation and goals feed the architectural Feeling Generation module. Emotion (e.g. .5, .7, 0, -.4, .3, …) combines with Mood (e.g. .7, -.2, .8, .3, .6, …) to produce a Feeling (e.g. .9, .6, .5, -.1, .8, …); the feeling's signed intensity (+/-) feeds Reinforcement Learning.]
Computing Feeling from Emotion and Mood (Marinier & Laird 2007)

- Assumption: appraisal dimensions are independent, so each dimension is combined separately.
- Limited range: inputs and outputs are in [0,1] or [-1,1].
- Distinguishability: very different inputs should lead to very different outputs.
- Non-linearity: a linear combination would violate limited range and distinguishability.
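A minimal sketch of a combination function satisfying these constraints, assuming per-dimension independence as stated above. `tanh` is one simple non-linear choice that keeps outputs in a limited range; it is an illustration of the constraints, not necessarily the thesis's exact formula.

```python
import math

# Per-dimension feeling generation: each appraisal dimension of emotion and
# mood is combined independently. tanh keeps the output in (-1, 1) no matter
# how the inputs sum, and is non-linear, so nearby extreme inputs remain
# distinguishable from moderate ones. Illustrative choice, not the thesis's.
def feeling(emotion: dict, mood: dict) -> dict:
    return {dim: math.tanh(emotion[dim] + mood.get(dim, 0.0)) for dim in emotion}

# Values loosely based on the example table in these slides.
e = {"unpredictability": 0.25, "goal_relevance": 0.75, "conduciveness": 0.5}
m = {"unpredictability": 0.40, "goal_relevance": 0.22, "conduciveness": -0.27}
f = feeling(e, m)  # every value lies strictly within (-1, 1)
```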
Example

| Appraisal (range)                | Emotion | Mood  | Feeling |
|----------------------------------|---------|-------|---------|
| Suddenness [0,1]                 | 0       | .235  |         |
| Unpredictability [0,1]           | .250    | .400  | .419    |
| Intrinsic-pleasantness [-1,1]    | 0       | -.235 |         |
| Goal-relevance [0,1]             | .750    | .222  | .750    |
| Causal-agent (self) [0,1]        | 0       | 0     | 0       |
| Causal-agent (other) [0,1]       | 0       | 0     | 0       |
| Causal-agent (nature) [0,1]      | 1       | .660  | 1       |
| Causal-motive (intentional) [0,1]| 0       | 0     | 0       |
| Causal-motive (chance) [0,1]     | 1       | .660  | 1       |
| Causal-motive (negligence) [0,1] | 0       | 0     | 0       |
| Outcome-probability [0,1]        | .750    | .516  | .759    |
| Discrepancy [0,1]                | .250    | .326  | .362    |
| Conduciveness [-1,1]             | .500    | -.269 | .290    |
| Control [-1,1]                   | .500    | -.141 | .402    |
| Power [-1,1]                     | .500    | -.141 | .402    |
| Label                            | ela-joy | anx-wor | ela-joy |
Maze Task

[Figure: maze with marked Start and Goal locations.]
Feeling Dynamics Results

[Figure: feeling dynamics on the "very easy" maze.]
Computing Feeling Intensity (Marinier & Laird 2007)

- Motivation: intensity summarizes how important (i.e., how good or bad) the situation is.
- Limited range: intensity should map onto [0,1].
- No dominant appraisal: no single value should drown out all the others. In particular, the values can't simply be multiplied, because if any one is 0 the intensity would be 0.
- Realization principle: expected events should be less intense than unexpected events.
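One illustrative intensity function satisfying the constraints above: averaging (rather than multiplying) the appraisal magnitudes so no single value dominates or zeroes the result, and scaling by a surprise factor built from outcome probability and discrepancy to implement the realization principle. This is a sketch of the stated constraints, not the thesis's exact formula.

```python
# Illustrative feeling-intensity function.
def intensity(feeling: dict, outcome_probability: float, discrepancy: float) -> float:
    # Average of magnitudes: limited to [0,1], no dominant appraisal, and a
    # single zero-valued dimension does not zero the whole intensity.
    magnitudes = [abs(v) for k, v in feeling.items()
                  if k not in ("outcome_probability", "discrepancy")]
    base = sum(magnitudes) / len(magnitudes)
    # Realization principle: unlikely outcomes, or likely outcomes that
    # deviate from expectation, are more surprising and hence more intense.
    surprise = (1 - outcome_probability) + outcome_probability * discrepancy
    return base * surprise  # product of two [0,1] factors stays in [0,1]
```

A fully expected event (outcome probability 1, discrepancy 0) yields intensity 0, matching the idea that realized expectations carry little emotional weight.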
Example

| Appraisal (range)                | Emotion | Mood  | Feeling |
|----------------------------------|---------|-------|---------|
| Suddenness [0,1]                 | 0       | .235  |         |
| Unpredictability [0,1]           | .250    | .400  | .419    |
| Intrinsic-pleasantness [-1,1]    | 0       | -.235 |         |
| Goal-relevance [0,1]             | .750    | .222  | .750    |
| Causal-agent (self) [0,1]        | 0       | 0     | 0       |
| Causal-agent (other) [0,1]       | 0       | 0     | 0       |
| Causal-agent (nature) [0,1]      | 1       | .660  | 1       |
| Causal-motive (intentional) [0,1]| 0       | 0     | 0       |
| Causal-motive (chance) [0,1]     | 1       | .660  | 1       |
| Causal-motive (negligence) [0,1] | 0       | 0     | 0       |
| Outcome-probability [0,1]        | .750    | .516  | .759    |
| Discrepancy [0,1]                | .250    | .326  | .362    |
| Conduciveness [-1,1]             | .500    | -.269 | .290    |
| Control [-1,1]                   | .500    | -.141 | .402    |
| Power [-1,1]                     | .500    | -.141 | .402    |
| Label                            | ela-joy | anx-wor | ela-joy |

Feeling intensity: .127
Outline

- Integrate emotion with a complete agent framework
- Computationally distinguish emotion, mood, and feeling
- Weight feeling's importance by computing its intensity
- Use feeling as an intrinsic reward signal to drive reinforcement learning
- Discussion & conclusion
Intrinsically Motivated Reinforcement Learning (Sutton & Barto 1998; Singh et al. 2004)

- Standard RL: a critic in the environment sends states and rewards to the agent, which sends back actions.
- Intrinsically motivated RL: the environment splits into an external environment and an internal environment within the "organism". The critic moves inside: sensations from the external environment pass through the appraisal process, and the signed feeling intensity (+/-) serves as the reward that drives the agent's decisions.
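A minimal sketch of this idea: tabular Q-learning on a small gridworld where the reward comes from a stand-in "feeling" signal (signed intensity) instead of an environment-supplied reward. The maze, the `intrinsic_reward` stand-in, and all parameters are hypothetical illustrations, not the thesis's task or appraisal model.

```python
import random

random.seed(0)
GRID = ["S...", ".##.", "...G"]          # illustrative maze: S=start, #=wall, G=goal
ROWS, COLS = len(GRID), len(GRID[0])
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]
GOAL = (2, 3)

def step(pos, a):
    """Move if the target cell is in bounds and not a wall; else stay put."""
    r, c = pos[0] + a[0], pos[1] + a[1]
    if 0 <= r < ROWS and 0 <= c < COLS and GRID[r][c] != "#":
        return (r, c)
    return pos

def intrinsic_reward(pos, new_pos):
    """Hypothetical feeling stand-in: signed intensity from goal conduciveness.
    Reaching the goal feels strongly positive, bumping a wall (obstructive)
    mildly negative, and unresolved steps carry a faint negative mood."""
    if new_pos == GOAL:
        return 1.0
    if new_pos == pos:
        return -0.1
    return -0.01

# Tabular Q-learning driven entirely by the intrinsic signal.
Q = {}
for episode in range(500):
    pos = (0, 0)
    for _ in range(50):
        if random.random() < 0.1:                     # epsilon-greedy exploration
            a = random.randrange(4)
        else:
            a = max(range(4), key=lambda i: Q.get((pos, i), 0.0))
        new_pos = step(pos, ACTIONS[a])
        r = intrinsic_reward(pos, new_pos)
        best_next = max(Q.get((new_pos, i), 0.0) for i in range(4))
        Q[(pos, a)] = Q.get((pos, a), 0.0) + 0.1 * (r + 0.9 * best_next - Q.get((pos, a), 0.0))
        pos = new_pos
        if pos == GOAL:
            break
```

The point of the sketch is the wiring, not the numbers: the agent gets a reward on every step (as the slides note, frequent reward is what accelerates learning), and the environment itself never hands out a reward.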
Learning Task

[Figure: maze with marked Start and Goal locations.]
Learning Results

[Figure: learning performance on the maze task.]
Discussion & Conclusion

Discussion:
- The agent learns quickly because it receives frequent reward signals.
- Mood accelerates learning by providing reward during steps in which the agent has no emotion.

Conclusion:
- Developed an initial computational model of emotion
- Integrated the model with a complete agent framework
- Demonstrated some functional advantages of the integration