CS147 - Terry Winograd - 1 Lecture 14 – Agents and Natural Language
Terry Winograd
CS147 - Introduction to Human-Computer Interaction Design
Computer Science Department, Stanford University, Autumn 2006


CS147 - Terry Winograd - 3 Learning Goals
What is the agent vision?
Why is it hard?
What are the promises and dangers?

CS147 - Terry Winograd - 4 Vision: Knowledge Navigator

CS147 - Terry Winograd - 5 Interface Agents - Negroponte
The “well trained English butler”
– Direct manipulation vs. delegation
– Metaphor of “agents” such as real estate agent, travel agent, …
Shared context/experience
– The issue is not I.Q. It is shared knowledge and the practice of using it in your best interests.
– Machine learning to develop context
– “…filter and extract relevant information from a body of information that is orders of magnitude larger than anything I can digest.”
The power of multiple independent agents

CS147 - Terry Winograd - 6 Natural Language
Ad hoc response (Eliza, 1966)
Symbolic understanding (SHRDLU, 1970)
Statistical recognition (current systems)

CS147 - Terry Winograd - 7 Eliza (Weizenbaum, 1966) – The Doctor is In
TELL ME YOUR PROBLEM
Men are all alike.
IN WHAT WAY?
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE?
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
He says I'm depressed much of the time.
I AM SORRY TO HEAR YOU ARE DEPRESSED
It's true. I am unhappy.
DO YOU THINK COMING HERE WILL HELP YOU NOT TO BE UNHAPPY
I need some help, that much seems certain.
WHAT WOULD IT MEAN TO YOU IF YOU GOT SOME HELP
Perhaps I could learn to get along with my mother.
TELL ME MORE ABOUT YOUR FAMILY
My mother takes care of me.

CS147 - Terry Winograd - 8 Eliza continued
WHO ELSE IN YOUR FAMILY TAKES CARE OF YOU
My father.
YOUR FATHER
You are like my father in some ways.
WHAT RESEMBLANCE DO YOU SEE
You are not very aggressive but I think you don't want me to notice that.
WHAT MAKES YOU THINK I AM NOT VERY AGGRESSIVE
You don't argue with me.
WHY DO YOU THINK I DON'T ARGUE WITH YOU
You are afraid of me.
DOES IT PLEASE YOU TO BELIEVE I AM AFRAID OF YOU
My father is afraid of everybody.
WHAT ELSE COMES TO YOUR MIND WHEN YOU THINK OF YOUR FATHER
Bullies.
DOES THAT HAVE ANYTHING TO DO WITH THE FACT THAT YOUR BOYFRIEND MADE YOU COME HERE

CS147 - Terry Winograd - 9 ELIZA script
key: if 3
  decomp: * if *
    reasmb: Do you think its likely that (2) ?
    reasmb: Do you wish that (2) ?
    reasmb: What do you know about (2) ?
    reasmb: Really, if (2) ?
key: dreamed 4
  decomp: * i dreamed *
    reasmb: Really, (2) ?
    reasmb: Have you ever fantasized (2) while you were awake ?
    reasmb: Have you ever dreamed (2) before ?
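The mechanics of such a script can be sketched in a few lines: a decomposition pattern with wildcards captures a fragment of the input, and a reassembly template splices it back into a canned response. This is a minimal illustration, not Weizenbaum's implementation; the rule set and fallback prompt are simplified stand-ins for the script entries above, tried in key-priority order (dreamed 4 before if 3).

```python
import random
import re

# Each rule pairs a decomposition pattern (the script's "decomp") with
# reassembly templates (its "reasmb"). The captured fragment replaces (2).
RULES = [
    (re.compile(r".*\bi dreamed\b(.*)", re.IGNORECASE),
     ["Really,{0} ?",
      "Have you ever fantasized{0} while you were awake ?"]),
    (re.compile(r".*\bif\b(.*)", re.IGNORECASE),
     ["Do you think its likely that{0} ?",
      "What do you know about{0} ?"]),
]

def respond(sentence: str) -> str:
    """Apply the first matching decomposition; fall back to a stock prompt."""
    for pattern, templates in RULES:
        match = pattern.match(sentence)
        if match:
            fragment = match.group(1).rstrip(".!?")
            return random.choice(templates).format(fragment)
    return "TELL ME MORE."

print(respond("i dreamed I was falling."))
```

Note that the program never interprets the fragment it echoes back; the appearance of understanding comes entirely from the templates, which is exactly the "ad hoc response" strategy of the slide before.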

CS147 - Terry Winograd - 10 Natural Language Understanding
SHRDLU (Winograd, 1970)

CS147 - Terry Winograd - 11 SHRDLU Dialog
pick up a big red block.
OK. (does it)
grasp the pyramid.
I DON'T UNDERSTAND WHICH PYRAMID YOU MEAN.
find a block which is taller than the one you are holding and put it into the box.
BY "IT", I ASSUME YOU MEAN THE BLOCK WHICH IS TALLER THAN THE ONE I AM HOLDING. OK. (does it)
what does the box contain?
THE BLUE PYRAMID AND THE BLUE BLOCK.
what is the pyramid supported by?
THE BOX.
how many blocks are not in the box?
FOUR OF THEM.
is at least one of them narrower than the one which i told you to pick up?
YES, THE RED CUBE.

CS147 - Terry Winograd - 12 SHRDLU continued
will you please stack up both of the red blocks and either a green cube or a pyramid?
OK. (does it)
which cube is sitting on the table?
THE LARGE GREEN ONE WHICH SUPPORTS THE RED PYRAMID.
is there a large block behind a pyramid?
YES, THREE OF THEM: A LARGE RED ONE, A LARGE GREEN CUBE, AND THE BLUE ONE.
is there anything which is bigger than every pyramid but is not as wide as the thing that supports it?
YES, THE BLUE BLOCK.
does a steeple
SORRY, I DON'T KNOW THE WORD "STEEPLE".
a "steeple" is a stack which contains two green cubes and a pyramid.
I UNDERSTAND.
are there any steeples now?
NO.
build one.

CS147 - Terry Winograd - 13 Difficulties in Computer Language Understanding
Multiplicity of mappings from level to level
– Ambiguity (multiple senses), polysemy, homonymy, etc.
Context dependence (e.g., pronouns)
Subtle complexities of rules
Ill-formedness of “natural” natural language
– False starts, ungrammaticality, wrong words
Difficulty of formalizing imprecise meanings
– Metaphor, vagueness, indirect speech acts
Pervasive use of world knowledge in cooperative communication
– The common sense problem
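The "multiplicity of mappings" difficulty is easy to make concrete: even before any parsing, most words admit several parts of speech, and the candidate analyses multiply. A toy enumeration for the classic "Time flies like an arrow" (the lexicon below is a simplified, invented tag set, not a real tagger's):

```python
from itertools import product

# Each word may carry several parts of speech.
LEXICON = {
    "time": {"noun", "verb"},
    "flies": {"noun", "verb"},
    "like": {"verb", "prep"},
    "an": {"det"},
    "arrow": {"noun"},
}

def tag_sequences(sentence):
    """Enumerate every possible part-of-speech assignment."""
    choices = [sorted(LEXICON[w]) for w in sentence.lower().split()]
    return list(product(*choices))

print(len(tag_sequences("Time flies like an arrow")))  # 8 candidate analyses
```

Eight analyses for a five-word sentence; a human effortlessly picks the intended one using exactly the world knowledge and context the slide lists as hard to formalize.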

CS147 - Terry Winograd - 14 Voice/Phone systems
Limited domain
Statistical recognition
Shaping the response
Social behavior
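These bullets fit together: the prompt "shapes" the caller toward a small vocabulary, so the recognizer only has to distinguish a handful of known words and can ignore everything else. A minimal sketch of that strategy (slot names and vocabularies are invented for illustration; a real system would score acoustic hypotheses, not string matches):

```python
# Limited-domain interpretation: keep only in-vocabulary words and
# reject the utterance unless exactly one of them appears.
VOCAB = {
    "departure_city": {"boston", "denver", "seattle"},
    "travel_day": {"monday", "tuesday", "friday"},
}

def interpret(slot, utterance):
    """Return the single in-vocabulary word, or None to trigger a re-prompt."""
    words = set(utterance.lower().replace(",", " ").split())
    hits = words & VOCAB[slot]
    return hits.pop() if len(hits) == 1 else None

print(interpret("departure_city", "Um, I'd like to leave from Boston please"))
# boston
print(interpret("departure_city", "somewhere warm"))  # None -> re-prompt
```

The None case is where the social behavior of the slide comes in: rather than guessing, the system re-prompts with a narrower question ("Which city: Boston, Denver, or Seattle?").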

CS147 - Terry Winograd - 15 Agents in the User Interface
Believable agents – metaphors with character
– Virtual characters
– Microsoft Bob, Microsoft Agents
– Conversational agents

CS147 - Terry Winograd - 16 Microsoft Bob

CS147 - Terry Winograd - 17 Anthropomorphism and The Media Equation
Byron Reeves and Clifford Nass, The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places, CSLI, 1996.
What triggers human-like responses?
– Looks
– Language
How does it affect the user?
– Inappropriate attributions (e.g., Eliza)
– False expectations (assumed intelligence)
– Affective responses (e.g., politeness, flattery)
– Discomfort (the “uncanny valley”)

The Uncanny Valley

CS147 - Terry Winograd - 19 Issues for Agent Design [Norman]
Ensuring that people feel in control
Hiding complexity while revealing underlying operations
Promoting accurate expectations and minimizing false hopes
Providing built-in safeguards
Addressing privacy concerns
Developing appropriate forms of human-agent interaction