EXPRESSIVE INTELLIGENCE STUDIO Lecture 11 CS148/248: Interactive Narrative UC Santa Cruz School of Engineering www.soe.ucsc.edu/classes/cmps148/Winter2009.


EXPRESSIVE INTELLIGENCE STUDIO Lecture 11 CS148/248: Interactive Narrative UC Santa Cruz School of Engineering 18 Feb 2009

Minstrel: a case-based story generator
- Minstrel is an author-modeling story generator
  - Pursues storytelling goals, looking up solutions in its case memory
- Minstrel is a model of creativity
- High-level assumptions about creativity:
  - Creativity is driven by a failure of problem solving
  - Creativity is an extension of problem solving
  - New solutions are created by using old knowledge in new ways
- Minstrel was used in two different domains:
  - Storytelling (the primary domain): generates King Arthur stories
  - Mechanical design

Case-based reasoning
- Many of you may be familiar with a 0th-order version of case-based reasoning: k-nearest neighbor (kNN)
  - Recall is a simple distance metric, often Euclidean
  - No adaptation, just lookup (majority voting and averaging are simple forms of adaptation)
  - No assessment
- Minstrel is an instance of symbolic CBR, which is much richer than kNN
- CBR systems learn by storing new cases
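The kNN view of CBR above can be sketched in a few lines of Python. This is a generic illustration of the "recall by distance, adapt by majority vote" idea, not Minstrel code:

```python
from collections import Counter
import math

def knn_classify(cases, query, k=3):
    """Classify `query` by majority vote over the k nearest stored cases.

    `cases` is a list of (feature_vector, label) pairs. Recall is a plain
    Euclidean distance metric, and "adaptation" is just a majority vote.
    """
    by_distance = sorted(cases, key=lambda c: math.dist(c[0], query))
    top_labels = [label for _, label in by_distance[:k]]
    return Counter(top_labels).most_common(1)[0][0]

# Toy case base: points labeled by which cluster they came from.
cases = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_classify(cases, (0.5, 0.5)))  # -> a
print(knn_classify(cases, (5.5, 5.5)))  # -> b
```

Note how little structure there is here compared to symbolic CBR: the "case" is an opaque point, so there is nothing to transform or adapt.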

Problem solving failure in CBR
- CBR problem solving can fail because the system:
  - Fails to find a solution to a similar problem
  - Fails to adapt a solution
  - Fails domain assessment
- Additionally, as a model of creativity, Minstrel fails if it is bored
  - Boredom occurs if a proposed solution has been seen too many times (used at least twice)
- Cases are stored in episodic memory: a memory of events and experiences indexed by salient cues (e.g. location, action performed)
  - Episodic memory forms classes of related episodes
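The boredom failure can be sketched as a simple usage counter. The class and solution names below are illustrative, not Minstrel's:

```python
from collections import Counter

class BoredomMonitor:
    """Sketch of Minstrel's boredom failure: a recalled solution that has
    already been used `limit` times (two, per the slide) is rejected,
    forcing creative search instead of straight recall."""
    def __init__(self, limit=2):
        self.uses = Counter()
        self.limit = limit

    def bored_with(self, solution):
        return self.uses[solution] >= self.limit

    def record(self, solution):
        self.uses[solution] += 1

m = BoredomMonitor()
m.record("knight-fights-troll")
m.record("knight-fights-troll")
print(m.bored_with("knight-fights-troll"))      # -> True
print(m.bored_with("princess-drinks-potion"))   # -> False
```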

Challenges of creativity
- Need to modify CBR to account for creative problem solving
  - New solutions require knowledge not indexed under the current problem
- Creativity begins when problem solving fails; in CBR, this means that no solutions are found for the current problem, or adaptation runs into a dead end
- Creativity involves recasting the problem
  - If trying to solve the current problem is failing, try solving a related problem (search the problem space, not just the solution space)
- Adaptation must avoid too much complexity
  - Solving arbitrarily different problems, then trying to adapt the results back to the original problem, won't work (limits on creativity)

Example problem space

TRAMs: reifying problem space search
- TRAM: Transform-Recall-Adapt Method
- A bundled knowledge representation that knows how to:
  - Transform a problem description into a related description
  - Recall a solution to the new problem
  - Adapt the solution back to the original problem
- TRAM:Cross-Domain-Solution
  - Transform strategy: find a new problem domain with similar actions and actors, and analogically map the current problem to the new domain
  - Adapt strategy: map any discovered solutions back to the original domain by reversing the analogical mapping
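The transform/recall/adapt bundle can be sketched as a small data structure. Everything here is illustrative: the names, the frozenset-of-(slot, value) stand-in for Minstrel's schema representation, and the exact-subset recall are assumptions for the sake of a runnable example:

```python
from dataclasses import dataclass
from typing import Callable, Optional

# A problem or solution is a frozenset of (slot, value) pairs, standing in
# for Minstrel's schema representation (a simplification for illustration).
Schema = frozenset

@dataclass
class TRAM:
    """Transform-Recall-Adapt Method: a bundled strategy that rewrites a
    problem, recalls a solution to the rewritten problem, and maps that
    solution back to the original problem."""
    name: str
    transform: Callable[[Schema], Schema]
    adapt: Callable[[Schema, Schema], Schema]

    def apply(self, problem: Schema,
              recall: Callable[[Schema], Optional[Schema]]) -> Optional[Schema]:
        transformed = self.transform(problem)
        recalled = recall(transformed)
        if recalled is None:
            return None
        return self.adapt(recalled, problem)

# A Generalize-Constraint-style TRAM, sketched as "drop the actor slot,
# then splice the original actor back into whatever was recalled".
drop_actor = TRAM(
    "Generalize-Constraint",
    transform=lambda p: Schema(s for s in p if s[0] != "actor"),
    adapt=lambda sol, orig: (sol - {s for s in sol if s[0] == "actor"})
                            | {s for s in orig if s[0] == "actor"},
)

memory = [Schema({("actor", "troll"), ("act", "give"), ("object", "meat")})]
def recall(q):
    # Recall succeeds if every constraint in the query appears in a case.
    return next((c for c in memory if q <= c), None)

problem = Schema({("actor", "knight"), ("act", "give"), ("object", "meat")})
print(sorted(drop_actor.apply(problem, recall)))
# -> [('act', 'give'), ('actor', 'knight'), ('object', 'meat')]
```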

Example of TRAM:Cross-Domain-Solution
- Domains the system knows about:
  - Knights and princesses (King Arthur)
  - Businessmen and friends
- Original problem: create a scene in which a knight accidentally meets a princess
- Transformed problem: create a scene in which a businessman accidentally meets somebody
- Recalled scene: "John was sitting at home one evening when his dog began acting strange. The dog was scratching at the door and whining for a walk. Finally, John decided to take the dog for a walk. While they were out, John ran across his old friend Pete, whom he hadn't seen in many years."
- Adapted solution: "One day while out riding, Lancelot's horse went into the woods. Lancelot could not control his horse. The horse took him deeper into the woods. The horse stopped. Lancelot saw Andrea, a Lady of the Court, who was picking berries."

Problem solving with TRAMs

Imaginative memory: recursive TRAM application
- Leaps in creativity result from combinations of small modifications

Inventing suicide: example TRAM application
- Episodic memory contains two episodes:
  - Knight Fight: a knight fights a troll with his sword, killing the troll and being injured in the process.
  - The Princess and the Potion: a lady of the court drank a potion to make herself ill.
- Storytelling goal to solve

Application of TRAM:Generalize-Constraint
- TRAM:Generalize-Constraint
  - Transform: select and generalize a feature (call it $generalized-feature) of the scene specification. Use this new scene specification as an index for imaginative recall.
  - Adapt: adapt the recalled solution to the current problem by adding $generalized-feature back to the recalled scene.

Generalizing features using a class hierarchy
- TRAM:Generalize-Constraint tries to use the most specific generalization that recalls a solution
  - If we completely removed the constraint, we could get creativity failures
- Example: solve "A knight gives a princess something to make her happy."
  - Transform: "A knight gives somebody something to make them happy."
  - Recall: "A knight gives a troll a hunk of meat to make it happy."
  - Adapt: "A knight gives a princess a hunk of meat to make her happy."
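Walking up a class hierarchy one step at a time, stopping at the most specific generalization that recalls a case, can be sketched as follows. The ISA hierarchy, dict-based schemas, and exact-match recall are all illustrative assumptions:

```python
# Toy ISA hierarchy: each class points to its parent.
ISA = {"princess": "lady", "lady": "person", "knight": "person",
       "troll": "monster", "monster": "creature", "person": "creature"}

def generalizations(term):
    """Yield the term, then successively more general classes."""
    while term is not None:
        yield term
        term = ISA.get(term)

def generalize_and_recall(problem, slot, recall):
    """TRAM:Generalize-Constraint sketch: widen one slot a step at a time,
    stopping at the most specific generalization that recalls a case."""
    for g in generalizations(problem[slot]):
        hit = recall({**problem, slot: g})
        if hit is not None:
            # Adapt: restore the original constraint in the recalled case.
            return {**hit, slot: problem[slot]}
    return None

cases = [{"act": "give", "recipient": "creature", "object": "meat"}]
def recall(q):
    return next((c for c in cases if c == q), None)

problem = {"act": "give", "recipient": "princess", "object": "meat"}
print(generalize_and_recall(problem, "recipient", recall))
# -> {'act': 'give', 'recipient': 'princess', 'object': 'meat'}
```

Because the loop tries `princess`, then `lady`, then `person` before succeeding at `creature`, the least generalization that works is the one used, matching the slide's point about avoiding creativity failures from over-generalizing.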

Example using recursive TRAMs
- TRAM:Similar-Outcomes-Partial-Change
  - Transform: replace the resulting state caused by an action with a related state.
  - Adapt: replace the state change in the recalled episode with the original state.
- Apply TRAM:SOPC to "A knight purposefully kills himself" to create "A knight purposefully injures himself"
- Recursively apply TRAM:GC to "A knight purposefully injures himself" to create "Someone purposefully injures themselves"
- Recall: "A lady of the court made herself ill by drinking poison"
- Adapt (using TRAM:GC) to "A knight made himself ill by drinking poison"
- Adapt (using TRAM:SOPC) to "A knight killed himself by drinking poison"
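The recursive structure of this example, where one TRAM's transformed problem becomes the input to another, can be sketched with a depth-limited search. The dict schemas, slot names, and the pre-generalized stored episode are illustrative stand-ins, not Minstrel's representation:

```python
def solve(problem, trams, recall, depth=2):
    """Imaginative memory sketch: try direct recall; failing that, apply
    each TRAM's transform, solve the transformed problem recursively, and
    adapt the result back.  The depth bound stands in for Minstrel's limit
    on how far creative search may wander from the original problem."""
    found = recall(problem)
    if found is not None:
        return found
    if depth == 0:
        return None
    for transform, adapt in trams:
        sub = solve(transform(problem), trams, recall, depth - 1)
        if sub is not None:
            return adapt(sub, problem)
    return None

# TRAM:Similar-Outcomes-Partial-Change: weaken "dead" to the related state
# "injured" on transform; restore the original outcome on adapt.
sopc = (lambda p: {**p, "effect": "injured"} if p.get("effect") == "dead" else p,
        lambda sol, orig: {**sol, "effect": orig["effect"]})
# TRAM:Generalize-Constraint: generalize the actor away; restore it on adapt.
gc = (lambda p: {**p, "actor": "anyone"},
      lambda sol, orig: {**sol, "actor": orig["actor"]})

# One stored episode, standing in for "a lady made herself ill with poison"
# (already indexed under the generalized actor for this toy recall).
memory = [{"actor": "anyone", "effect": "injured", "means": "drink-poison"}]
def recall(q):
    return next((c for c in memory
                 if all(c.get(k) == v for k, v in q.items())), None)

goal = {"actor": "knight", "effect": "dead"}  # "a knight kills himself"
print(solve(goal, [sopc, gc], recall))
# -> {'actor': 'knight', 'effect': 'dead', 'means': 'drink-poison'}
```

Neither TRAM alone recalls anything for the goal; only the composition SOPC-then-GC reaches the stored episode, which is then adapted back through both TRAMs in reverse order, mirroring the suicide example.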

Randomness
- In AI models of creativity, we want to minimize the effect of randomness
  - Random is not the same as creative
- In Minstrel, random decisions are made only when:
  - Multiple episodes are recalled from a single query
  - Multiple author plans or TRAMs are applicable
- These random decisions are highly constrained by the problem-solving process
  - There's a lot of process on top of the randomness
- Compare with the use of randomness in:
  - Context-free grammars
  - Mad Libs

Minstrel is driven by author-level goals
- The preceding slides described the solving of character-level goals
  - The character goals were set by an author to serve the story, but they aren't themselves directly goals about the form and content of the overall story
- Minstrel's author goals:
  - Thematic goals: the stories illustrate a theme, in Minstrel's case Planning Advice Themes (e.g. "A bird in the hand is worth two in the bush.")
  - Drama goals: goals regarding the unity of action (tragedy, foreshadowing)
  - Consistency goals: motivate and explain story actions
  - Presentation goals: goals about which events must be fully described, and which can be summarized or omitted (in general, diegetic goals)

Author-level processes
- Author planning is broken into two processes:
  - Goal management
  - Problem solving
- Author-level problem solving is accomplished in exactly the same way as character-level problem solving

Representations of goals and plans
- Character-level and author-level goals use the same representation
- Character-level and author-level plans are represented differently
  - Character-level plans are represented using explicit declarative schemas (we've seen examples earlier in the lecture)
  - Author-level plans are represented using Lisp
    - Avoids having to invent yet another representation of computation
    - But makes author-level plans a black box, hugely limiting creative problem solving

Example author plan
- Name: ALP:Make-Consistent-P-Health
- Goal type: &Check-Consistency
- Object: *AL-OBJ*
- Test: *AL-OBJ* is a state schema that represents a character being wounded, and it is not motivating a goal by that character to protect his health
- Body:
  1. Create a &P-Health (health protection) goal for the character being wounded in *AL-OBJ*
  2. Add the new goal to the story by creating a &Thwarts link from *AL-OBJ* to the new goal
  3. Create author-level goals to make sure the new goal is consistent and fully instantiated
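Minstrel wrote author-level plans directly in Lisp; the Python below is only a stand-in to show the test/body shape of the plan above. All data structures, slot names, and the agenda mechanism are illustrative assumptions:

```python
def make_consistent_p_health(al_obj, story, agenda):
    """Sketch of ALP:Make-Consistent-P-Health.  Returns True if the plan
    fired, False if its test failed (Minstrel's real plans were Lisp code,
    not this dict-based representation)."""
    # Test: fire only on a "wounded" state that is not already motivating
    # a health-protection goal for that character.
    if al_obj.get("type") != "state" or al_obj.get("state") != "wounded":
        return False
    if any(g["goal"] == "p-health" and g["character"] == al_obj["character"]
           for g in story["goals"]):
        return False
    # Body: create the &P-Health goal, link it into the story with a
    # thwarts link, and queue follow-up author-level consistency goals.
    goal = {"goal": "p-health", "character": al_obj["character"]}
    story["goals"].append(goal)
    story["links"].append(("thwarts", al_obj, goal))
    agenda.append(("check-consistency", goal))
    agenda.append(("instantiate", goal))
    return True

story = {"goals": [], "links": []}
agenda = []
wound = {"type": "state", "state": "wounded", "character": "Lancelot"}
make_consistent_p_health(wound, story, agenda)
print(story["goals"])  # -> [{'goal': 'p-health', 'character': 'Lancelot'}]
```

Note that because the plan body is ordinary code, nothing here can be transformed or adapted by a TRAM, which is exactly the black-box limitation the previous slide describes.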

Nonadaptive creativity
- Since author-level plans can't be directly adapted, the system author (meta-author) provides generalized default versions of author plans that are guaranteed to work
- Example: the generalized ALP:Default-Consistent plan says that, for whatever object it's ensuring consistency for, the object (e.g. state, goal) is consistent if it has a type
- There's a special TRAM that recalls default author plans if the standard recall TRAM doesn't work
- See page 85 for a snippet of the index tree for author plans

Minstrel conclusion
- Minstrel provides a case-based model of creativity, with specific support for storytelling via author-level goals and plans
- Imaginative memory (recursive TRAM application) is used to instantiate story episodes that meet requirements set by author-level plans
- Author-level problem solving uses the same TRAM mechanism as character-level problem solving, though creative transformation is greatly limited by the black-box nature of author plans
- Authoring within the Minstrel architecture means specifying:
  - An ontology for character actions (the types of states, objects, and character goals in the world)
  - Initial character episodes (stories) known to the system
  - TRAMs (these are relatively story-domain independent)
  - Author goals and plans (these are relatively story-domain independent)