
Communication and Language Chap. 22

Outline
  Communication
  Grammar
  Syntactic analysis
  Problems

Communication
“Classical” view (pre-1953): language consists of sentences that are true/false (cf. logic)
“Modern” view (post-1953): language is a form of action
  Wittgenstein (1953) Philosophical Investigations
  Austin (1962) How to Do Things with Words
  Searle (1969) Speech Acts

Speech acts
Speech acts achieve the speaker's goals:
  Inform: “There's a pit in front of you”
  Query: “Can you see the gold?”
  Command: “Pick it up”
  Promise: “I'll share the gold with you”
  Acknowledge: “OK”
Speech act planning requires knowledge of:
  the situation
  semantic and syntactic conventions
  the hearer's goals, knowledge base, and rationality

Stages in communication (informing)
  Intention: S wants to inform H that P
  Generation: S selects words W to express P in context C
  Synthesis: S utters words W
  Perception: H perceives W′ in context C′
  Analysis: H infers possible meanings P1, …, Pn
  Disambiguation: H infers intended meaning Pi
  Incorporation: H incorporates Pi into KB

How could this go wrong?
  Insincerity (S doesn't believe P)
  Speech-recognition failure (“speech wreck ignition”)
  Ambiguous utterance
  Differing understanding of the current context (C ≠ C′)

Grammar
Vervet monkeys, antelopes, etc. use isolated symbols for sentences ⇒ restricted set of communicable propositions, no generative capacity (Chomsky (1957): Syntactic Structures)
Grammar specifies the compositional structure of complex messages, e.g., speech (linear), text (linear), music (two-dimensional)
A formal language is a set of strings of terminal symbols
Each string in the language can be analyzed/generated by the grammar
The grammar is a set of rewrite rules, e.g.,
  S → NP VP
  Article → the | a | an | …
Here S is the sentence symbol; NP and VP are nonterminals

Grammar types
Regular: nonterminal → terminal [nonterminal]
  S → a S
  S → Λ
Context-free: nonterminal → anything
  S → a S b
Context-sensitive: more nonterminals on right-hand side
  A S B → A A a B B
Recursively enumerable: no constraints
Related to Post systems and Kleene systems of rewrite rules
Natural languages probably context-free, parsable in real time!
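A minimal sketch of the regular/context-free boundary above (function names are illustrative): the context-free rules S → aSb and S → Λ generate the language aⁿbⁿ, which no regular grammar can produce, yet a two-branch recursive-descent recognizer suffices to check membership.

```python
# Sketch: the context-free language a^n b^n (S -> aSb | empty),
# which lies beyond the power of any regular grammar.

def generate_anbn(n):
    """Apply S -> aSb n times, then S -> empty."""
    return "a" * n + "b" * n

def recognize_anbn(s):
    """Recursive-descent recognizer for S -> aSb | empty."""
    if s == "":
        return True                       # S -> empty
    if s.startswith("a") and s.endswith("b"):
        return recognize_anbn(s[1:-1])    # S -> a S b
    return False
```

The recognizer mirrors the grammar rule for rule, which is the point of the slide: the rewrite rules are both a generator and an analyzer of the language.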

Wumpus lexicon
Noun → stench | breeze | glitter | nothing | wumpus | pit | pits | gold | east | …
Verb → is | see | smell | shoot | feel | stinks | go | grab | carry | kill | turn | …
Adjective → right | left | east | south | back | smelly | …
Adverb → here | there | nearby | ahead | right | left | east | south | back | …
Pronoun → me | you | I | it | …
Name → John | Mary | Boston | UCB | PAJC | …
Article → the | a | an | …
Preposition → to | in | on | near | …
Conjunction → and | or | but | …
Digit → 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
Divided into closed and open classes

Wumpus grammar
S → NP VP (I + feel a breeze)
  | S Conjunction S (I feel a breeze + and + I smell a wumpus)
NP → Pronoun (I)
  | Noun (pits)
  | Article Noun (the + wumpus)
  | Digit Digit (3 4)
  | NP PP (the wumpus + to the east)
  | NP RelClause (the wumpus + that is smelly)
VP → Verb (stinks)
  | VP NP (feel + a breeze)
  | VP Adjective (is + smelly)
  | VP PP (turn + to the east)
  | VP Adverb (go + ahead)
PP → Preposition NP (to + the east)
RelClause → that VP (that + is smelly)
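The grammar above can be run “forwards” as a sentence generator. A minimal sketch, using a hand-picked non-recursive subset of the rules (recursive rules such as S → S Conjunction S are omitted here so that expansion always terminates; the rule and word choices are illustrative, not the full grammar):

```python
import random

# Nonterminal -> list of possible right-hand sides (a toy, terminating
# fragment of the wumpus grammar; symbols not in GRAMMAR are terminals).
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Pronoun"], ["Article", "Noun"]],
    "VP": [["Verb"], ["Verb", "NP"]],
    "Pronoun": [["I"]],
    "Article": [["the"], ["a"]],
    "Noun": [["wumpus"], ["breeze"], ["pit"]],
    "Verb": [["feel"], ["smell"], ["see"]],
}

def generate(symbol="S"):
    """Rewrite symbol left-to-right until only terminals remain."""
    if symbol not in GRAMMAR:            # terminal: emit the word itself
        return [symbol]
    words = []
    for sym in random.choice(GRAMMAR[symbol]):
        words.extend(generate(sym))
    return words

print(" ".join(generate()))  # e.g., "I smell a wumpus"
```

Running the same rules “backwards” (checking whether a given string could have been produced) is exactly the parsing problem of the following slides.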

Parse trees

Grammaticality judgements
Formal language L1 may differ from natural language L2
Adjusting L1 to agree with L2 is a learning problem!
  * the gold grab the wumpus
  * I smell the wumpus the gold
  I give the wumpus the gold
  * I donate the wumpus the gold
Intersubjective agreement somewhat reliable, independent of semantics!
Real grammars run 10-500 pages, insufficient even for “proper” English

Syntax in NLP
Most view syntactic structure as an essential step towards meaning: “Mary hit John” ≠ “John hit Mary”
“And since I was not informed, as a matter of fact, since I did not know that there were excess funds until we, ourselves, in that checkup after the whole thing blew up, and that was, if you'll remember, that was the incident in which the attorney general came to me and told me that he had seen a memo that indicated that there were no more funds.”
“Wouldn't the sentence ‘I want to put a hyphen between the words Fish and And and And and Chips in my Fish-And-Chips sign’ have been clearer if quotation marks had been placed before Fish, and between Fish and and, and and and And, and And and and, and and and Chips, as well as after Chips?”

Context-free parsing
Bottom-up parsing works by replacing any substring that matches the RHS of a rule with the rule's LHS
Efficient algorithms (e.g., chart parsing, Section 22.3): O(n³) for context-free grammars, running at several thousand words/sec for real grammars
Context-free parsing ≡ Boolean matrix multiplication (Lee, 2002) ⇒ unlikely to find faster practical algorithms
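The O(n³) bound can be made concrete with a CYK-style chart: `table[i][j]` holds every category that spans words i..j, filled bottom-up from the lexicon. The sketch below assumes a tiny hand-converted Chomsky-normal-form fragment of the wumpus grammar (the rule set and lexicon are illustrative):

```python
LEXICON = {            # preterminal -> words it covers
    "Pronoun": {"I"}, "Verb": {"feel", "smell"},
    "Article": {"the", "a"}, "Noun": {"wumpus", "breeze"},
}
BINARY = {             # CNF binary rules: lhs -> (rhs1, rhs2)
    "S": [("NP", "VP")],
    "NP": [("Article", "Noun")],
    "VP": [("Verb", "NP")],
}
UNARY = {"NP": ["Pronoun"], "VP": ["Verb"]}   # unit rules NP -> Pronoun, VP -> Verb

def close_unary(cell):
    """Add the LHS of any unit rule whose RHS is already in the cell."""
    changed = True
    while changed:
        changed = False
        for lhs, rhss in UNARY.items():
            if lhs not in cell and any(r in cell for r in rhss):
                cell.add(lhs)
                changed = True

def cyk(words):
    """Return True iff words parse as S; three nested span loops -> O(n^3)."""
    n = len(words)
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):                 # length-1 spans from the lexicon
        for cat, vocab in LEXICON.items():
            if w in vocab:
                table[i][i + 1].add(cat)
        close_unary(table[i][i + 1])
    for span in range(2, n + 1):                  # combine adjacent spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):             # every split point
                for lhs, rhss in BINARY.items():
                    for b, c in rhss:
                        if b in table[i][k] and c in table[k][j]:
                            table[i][j].add(lhs)
            close_unary(table[i][j])
    return "S" in table[0][n]
```

Each substring matching a rule's RHS is replaced (in the chart, not the string) by the rule's LHS, which is the bottom-up strategy described above.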

Logical grammars
BNF notation for grammars is too restrictive:
  difficult to add “side conditions” (number agreement, etc.)
  difficult to connect syntax to semantics
Idea: express grammar rules as logic
  X → Y Z becomes Y(s1) ∧ Z(s2) ⇒ X(Append(s1, s2))
  X → word becomes X([“word”])
  X → Y | Z becomes Y(s) ⇒ X(s) and Z(s) ⇒ X(s)
Here, X(s) means that string s can be interpreted as an X

Logical grammars contd.
Now it's easy to augment the rules:
  NP(s1) ∧ EatsBreakfast(Ref(s1)) ∧ VP(s2) ⇒ NP(Append(s1, [“who”], s2))
  NP(s1) ∧ Number(s1, n) ∧ VP(s2) ∧ Number(s2, n) ⇒ S(Append(s1, s2))
Parsing is reduced to logical inference:
  Ask(KB, S([“I” “am” “a” “wumpus”]))
(Can add extra arguments to return the parse structure, semantics)
Generation simply requires a query with uninstantiated variables:
  Ask(KB, S(x))
If we add arguments to nonterminals to construct sentence semantics, NLP generation can be done from a given logical sentence:
  Ask(KB, S(x, At(Robot, [1, 1])))
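The parsing-as-inference idea can be sketched directly: treat each X(s) as a predicate and backward-chain over rules of the form Y(s1) ∧ Z(s2) ⇒ X(Append(s1, s2)), trying every split point of s in place of Append. A minimal sketch (the rule set and lexicon are a toy fragment chosen for illustration):

```python
RULES = {              # nonterminal -> list of right-hand sides
    "S": [("NP", "VP")],
    "NP": [("Pronoun",), ("Article", "Noun")],
    "VP": [("Verb",), ("Verb", "NP")],
}
WORDS = {              # lexical predicates: cat(s) holds for one-word strings
    "Pronoun": {"I"}, "Verb": {"am", "smell"},
    "Article": {"a", "the"}, "Noun": {"wumpus"},
}

def holds(cat, s):
    """Backward-chain: does the predicate cat(s) follow from the rules?"""
    if cat in WORDS:                       # lexical fact, e.g. Noun(["wumpus"])
        return len(s) == 1 and s[0] in WORDS[cat]
    for rhs in RULES.get(cat, []):
        if len(rhs) == 1:                  # unit rule: Y(s) => X(s)
            if holds(rhs[0], s):
                return True
        else:                              # Y(s1) ^ Z(s2) => X(Append(s1, s2)):
            y, z = rhs                     # try every way to split s into s1 + s2
            for k in range(1, len(s)):
                if holds(y, s[:k]) and holds(z, s[k:]):
                    return True
    return False
```

So `holds("S", ["I", "am", "a", "wumpus"])` plays the role of the query Ask(KB, S([“I” “am” “a” “wumpus”])) above; side conditions such as number agreement would become extra conjuncts checked at each rule application.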

Real language
Real human languages present many problems for NLP:
  ambiguity
  anaphora
  indexicality
  vagueness
  discourse structure
  metonymy
  metaphor
  noncompositionality

Ambiguity
  Squad helps dog bite victim
  Helicopter powered by human flies
  American pushes bottle up Germans
  I ate spaghetti with meatballs | salad | abandon | a fork | a friend
Ambiguity can be lexical (polysemy), syntactic, semantic, referential

Anaphora
Using pronouns to refer back to entities already introduced in the text:
  After Mary proposed to John, they found a preacher and got married. For the honeymoon, they went to Hawaii.
  Mary saw a ring through the window and asked John for it.
  Mary threw a rock at the window and broke it.

Indexicality
Indexical sentences refer to the utterance situation (place, time, speaker/hearer, etc.):
  I am over here
  Why did you do that?

Metonymy
Using one noun phrase to stand for another:
  I've read Shakespeare
  Chrysler announced a new model
  The ham sandwich on Table 4 wants another beer

Metaphor
“Non-literal” usage of words and phrases, often systematic:
  I've tried killing the process but it won't die. Its parent keeps it alive.

Noncompositionality
  basketball shoes | baby shoes | alligator shoes | designer shoes | brake shoes
  red book | red pen | red hair | red herring
  small moon | large molecule | mere child | alleged murderer
  real leather | artificial grass