Introduction to Artificial Intelligence – Unit 9: Communication. Course 67842, The Hebrew University of Jerusalem, School of Engineering and Computer Science. Academic Year: 2009/2010. Instructor: Jeff Rosenschein. (Chapter 22, “Artificial Intelligence: A Modern Approach”)

Outline  Communication  Grammar  Syntactic Analysis  Problems 2

Communication  “Classical” view (pre-1953):  language consists of sentences that are true/false (similar to logic)  “Modern” view (post-1953):  language is a form of action  Wittgenstein (1953) Philosophical Investigations  Austin (1962) How to Do Things with Words  Searle (1969) Speech Acts Why? To change the actions of other agents. 3

Speech Acts  Speech acts achieve the speaker’s goals:  Inform “There’s a pit in front of you”  Query “Can you see the gold?”  Command “Pick it up”  Promise “I’ll share the gold with you”  Acknowledge “OK”  Speech act planning requires knowledge of  Situation  Semantic and syntactic conventions  Hearer’s goals, knowledge base, and rationality 4

Stages in communication (informing)  Intention: S wants to inform H that P  Generation: S selects words W to express P in context C  Synthesis: S utters words W  Perception: H perceives W′ in context C′  Analysis: H infers possible meanings P1, …, Pn  Disambiguation: H infers intended meaning Pi  Incorporation: H incorporates Pi into KB How could this go wrong? 5

Stages in communication (informing)  How could this go wrong?  Insincerity (S doesn’t believe P)  Speech wreck ignition failure (i.e., speech recognition error)  Ambiguous utterance  Differing understanding of current context (C ≠ C′) 6

Grammar  Vervet monkeys, antelopes, etc. use isolated symbols for sentences ⇒ restricted set of communicable propositions, no generative capacity (Chomsky (1957): Syntactic Structures)  Grammar specifies the compositional structure of complex messages, e.g., speech (linear), text (linear), music (two-dimensional)  A formal language is a set of strings of terminal symbols  Each string in the language can be analyzed/generated by the grammar 7

Grammar  The grammar is a set of rewrite rules, e.g.,  S → NP VP  Article → the | a | an |...  Here S is the sentence symbol, NP and VP are nonterminals 8

Grammar Types – Chomsky Hierarchy  Regular: nonterminal → terminal [nonterminal]. Can be accepted by a deterministic finite-state machine, and by a read-only Turing machine  S → aS  S → Λ  Context-free: nonterminal → anything. Can be accepted by a pushdown automaton, a finite automaton that can use a stack  S → aSb  Context-sensitive: the right-hand side has at least as many symbols as the left; can be accepted by a linear-bounded nondeterministic Turing machine  ASB → AAaBB  Recursively enumerable: no constraints; accepted by a Turing machine  Natural languages are probably context-free; Chomsky’s theory holds that utterances have a syntax that can be characterized by a context-free grammar extended with transformational rules 9
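As a small illustration of the gap between the first two levels, the sketch below (the example languages a* and aⁿbⁿ are assumed for illustration, not taken from the slides) recognizes the regular language a* with a memoryless left-to-right scan, while aⁿbⁿ, the textbook context-free language, needs the counting power that a pushdown automaton’s stack provides.

```python
def accepts_a_star(s):
    """Regular language a*: a finite-state scan, no memory needed."""
    return all(ch == "a" for ch in s)

def accepts_anbn(s):
    """Context-free language a^n b^n: push the a's, pop them against the b's."""
    stack = []
    i = 0
    while i < len(s) and s[i] == "a":
        stack.append("a")                 # push phase
        i += 1
    while i < len(s) and s[i] == "b":
        if not stack:
            return False                  # more b's than a's
        stack.pop()                       # pop phase
        i += 1
    return i == len(s) and not stack      # whole string consumed, stack empty

assert accepts_a_star("aaaa") and not accepts_a_star("aab")
assert accepts_anbn("aaabbb") and not accepts_anbn("aabbb")
```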

Wumpus Lexicon  Divided into closed and open classes 10

Wumpus Lexicon  Divided into closed and open classes 11

Wumpus grammar 12

Grammaticality judgments  Formal language L1 may differ from natural language L2  Adjusting L1 to agree with L2 is a learning problem: the gold grab the wumpus / I smell the wumpus the gold / I give the wumpus the gold / I donate the wumpus the gold  Intersubjective agreement is somewhat reliable, independent of semantics  Real grammars run to hundreds of pages and are still insufficient even for “proper” English 13
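One quick way to see the L1 ≠ L2 problem is to run a naive recognizer over the toy grammar sketched earlier: it happily accepts “the gold grab the wumpus”, which English speakers reject, because the toy rules encode no agreement or verb subcategorization. The grammar and recognizer below are illustrative assumptions, not the slides’ Wumpus grammar; the grammar is repeated so the snippet runs on its own.

```python
# The same toy grammar as in the earlier generation sketch (an
# illustrative fragment, not the slides' Wumpus grammar).
GRAMMAR = {
    "S":       [["NP", "VP"]],
    "NP":      [["Pronoun"], ["Article", "Noun"]],
    "VP":      [["Verb"], ["Verb", "NP"]],
    "Pronoun": [["I"], ["it"]],
    "Article": [["the"], ["a"]],
    "Noun":    [["wumpus"], ["gold"], ["pit"]],
    "Verb":    [["smell"], ["grab"], ["see"]],
}

def derives(symbol, words):
    """True if `symbol` can derive exactly the word sequence `words`."""
    if symbol not in GRAMMAR:                    # terminal word
        return len(words) == 1 and words[0] == symbol
    return any(matches(rhs, words) for rhs in GRAMMAR[symbol])

def matches(rhs, words):
    """True if the symbol sequence `rhs` can derive `words`."""
    if not rhs:
        return not words
    # try every split: first symbol covers words[:k], the rest covers words[k:]
    return any(derives(rhs[0], words[:k]) and matches(rhs[1:], words[k:])
               for k in range(len(words) + 1))

print(derives("S", "I smell the wumpus".split()))        # True
print(derives("S", "the gold grab the wumpus".split()))  # also True -- the toy
# grammar has no number agreement, unlike real English
```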

Parse Trees  Exhibit the grammatical structure of a sentence 14

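The parse-tree figures themselves do not survive in this transcript. As a stand-in, here is one way a tree for “I smell the wumpus” under the toy grammar above could be represented and printed; nested tuples are just one convenient encoding, and this is a sketch rather than the slides’ figure.

```python
# A parse tree as nested (label, children...) tuples; leaves are words.
TREE = ("S",
        ("NP", ("Pronoun", "I")),
        ("VP", ("Verb", "smell"),
               ("NP", ("Article", "the"), ("Noun", "wumpus"))))

def show(node, indent=0):
    """Print the tree with one node per line, indented by depth."""
    if isinstance(node, str):          # a leaf word
        print(" " * indent + node)
        return
    label, *children = node
    print(" " * indent + label)
    for child in children:
        show(child, indent + 2)

show(TREE)
# S
#   NP
#     Pronoun
#       I
#   VP
#     Verb
#       smell
#     NP
#       ...
```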

Syntax in NLP  Most view syntactic structure as an essential step towards meaning; “Mary hit John” ≠ “John hit Mary”  “And since I was not informed --- as a matter of fact, since I did not know that there were excess funds until we, ourselves, in that checkup after the whole thing blew up, and that was, if you’ll remember, that was the incident in which the attorney general came to me and told me that he had seen a memo that indicated that there were no more funds.” 19

Syntax in NLP  “Wouldn’t the sentence ‘I want to put a hyphen between the words Fish and And and And and Chips in my Fish-And-Chips sign’ have been clearer if quotation marks had been placed before Fish, and between Fish and and, and and and And, and And and and, and and and And, and And and and, and and and Chips, as well as after Chips?”  …between the words “Fish” and “And” and “And” and “Chips” 20

Context-free parsing  Bottom-up parsing works by repeatedly replacing any substring that matches the RHS of a rule with that rule’s LHS  Efficient algorithms exist (e.g., chart parsing, Section 22.3, Russell and Norvig): O(n³) for context-free grammars, running at several thousand words/sec for real grammars  Context-free parsing ≡ Boolean matrix multiplication (Lee, 2002) ⇒ unlikely to find faster practical algorithms 21
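A minimal sketch of the O(n³) idea is the CYK algorithm over a grammar in Chomsky Normal Form; the tiny CNF fragment below is an assumption for illustration (in the spirit of the Wumpus examples), not the grammar from the slides. Chart parsing as in Section 22.3 is more general, but the triple loop over spans and split points is the same source of the cubic bound.

```python
# CYK recognition for a grammar in Chomsky Normal Form.
BINARY = {            # rules A -> B C, stored as (B, C): {A, ...}
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"):  {"VP"},
}
LEXICAL = {           # rules A -> word, stored as word: {A, ...}
    "I": {"NP"}, "the": {"Det"},
    "wumpus": {"N"}, "gold": {"N"},
    "smell": {"V"}, "grab": {"V"},
}

def cyk(words):
    """Return the nonterminals that derive the whole sentence.
    The nested loops over spans and split points give the O(n^3) bound."""
    n = len(words)
    # chart[i][j] = set of nonterminals deriving words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICAL.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):                # split point
                for b in chart[i][k]:
                    for c in chart[k][j]:
                        chart[i][j] |= BINARY.get((b, c), set())
    return chart[0][n]

print("S" in cyk("I smell the wumpus".split()))     # True
print("S" in cyk("the gold wumpus smell".split()))  # False
```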

Logical grammars  BNF notation for grammars is too restrictive:  difficult to add “side conditions” (number agreement, etc.)  difficult to connect syntax to semantics  Idea: express grammar rules as logic, where X(s) means that string s can be interpreted as an X 22

Logical Grammars, cont.  Now it’s easy to augment the rules 23
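To suggest what augmenting the rules can look like, here is a hedged sketch in Python rather than logic: each category becomes a relation over word sequences with an extra number argument, and the sentence rule imposes the subject-verb agreement side condition that plain BNF cannot express. The vocabulary and categories are assumptions for illustration, not the slides’ actual logical rules.

```python
# Grammar categories as relations over word sequences, augmented with a
# "number" argument so the sentence rule can enforce agreement.
def noun_phrase(words, number):
    singular = (["the", "wumpus"], ["the", "gold"])
    plural = (["the", "wumpuses"],)
    return list(words) in (singular if number == "singular" else plural)

def verb_phrase(words, number):
    verbs = {"singular": ("smells", "grabs"), "plural": ("smell", "grab")}
    if not words or words[0] not in verbs[number]:
        return False
    rest = words[1:]
    # intransitive, or transitive with an object NP of either number
    return not rest or noun_phrase(rest, "singular") or noun_phrase(rest, "plural")

def sentence(words):
    """S(w) holds if w = x + y with NP(x, n) and VP(y, n) for the SAME
    number n -- the agreement "side condition"."""
    for k in range(1, len(words)):
        for n in ("singular", "plural"):
            if noun_phrase(words[:k], n) and verb_phrase(words[k:], n):
                return True
    return False

print(sentence("the wumpus grabs the gold".split()))  # True
print(sentence("the wumpus grab the gold".split()))   # False: agreement fails
```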

Real Language  Real human languages provide many problems for NLP:  ambiguity  anaphora  indexicality  vagueness  discourse structure  metonymy  metaphor  noncompositionality 24

Ambiguity  Beach closing down last year  Squad helps dog bite victim  Helicopter powered by human flies  American pushes bottle up Germans  I ate spaghetti with meatballs  with salad  with abandon  with a fork  with a friend  Ambiguity can be lexical (polysemy), syntactic, semantic, referential 25
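Syntactic ambiguity in particular can be made concrete by counting derivations: extending the CYK sketch above to tally parse counts shows that “I ate spaghetti with a fork” gets two analyses, with the prepositional phrase attached either to the verb phrase or to the noun phrase. The CNF fragment below is again an assumption for illustration, not a grammar from the slides.

```python
from collections import defaultdict

BINARY = {                       # A -> B C, stored as (B, C): [A, ...]
    ("NP", "VP"): ["S"],
    ("V", "NP"):  ["VP"],
    ("VP", "PP"): ["VP"],        # PP attached to the verb phrase
    ("NP", "PP"): ["NP"],        # PP attached to the noun phrase
    ("Det", "N"): ["NP"],
    ("P", "NP"):  ["PP"],
}
LEXICAL = {"I": ["NP"], "spaghetti": ["NP"], "fork": ["N"],
           "a": ["Det"], "ate": ["V"], "with": ["P"]}

def parse_counts(words):
    """CYK variant that counts the number of derivations per nonterminal."""
    n = len(words)
    chart = [[defaultdict(int) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for cat in LEXICAL.get(w, []):
            chart[i][i + 1][cat] += 1
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b, cb in chart[i][k].items():
                    for c, cc in chart[k][j].items():
                        for a in BINARY.get((b, c), []):
                            chart[i][j][a] += cb * cc
    return chart[0][n]

print(parse_counts("I ate spaghetti with a fork".split())["S"])  # 2 parses
```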

Ambiguity resolved in speech  The sentence: “I never said she stole my money.” can have seven different meanings depending on which word is stressed  I never said she stole my money. 26

Anaphora  Using pronouns to refer back to entities already introduced in the text:  After Mary proposed to John, they found a preacher and got married.  For the honeymoon, they went to Hawaii.  Mary saw a ring through the window and asked John for it.  Mary threw a rock at the window and broke it. 27

Indexicality  Indexical sentences refer to utterance situation (place, time, S/H, etc.):  I am over here.  Why did you do that? 28

Metonymy  Using one noun phrase to stand for another:  I’ve read Shakespeare.  Chrysler announced record profits.  The ham sandwich on Table 4 wants another beer. 29

Metaphor  “Non-literal” usage of words and phrases, often systematic:  I’ve tried killing the process but it won’t die. Its parent keeps it alive. 30

Noncompositionality  basketball shoes  baby shoes  alligator shoes  designer shoes  brake shoes  red book  red pen  red hair  red herring  small moon  large molecule  mere child  alleged murderer  real leather  artificial grass 31