Fall 2005 Lecture Notes #6 EECS 595 / LING 541 / SI 661 Natural Language Processing

Lexicalized and probabilistic parsing

Probabilistic CFG G = (N, Σ, P, S)
Non-terminals (N)
Terminals (Σ)
Productions (P), augmented with probabilities: A → β [p]
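A parse tree's probability is the product of the probabilities of the rules used to derive it. As a sketch of how such a grammar can be stored, here is a toy PCFG in Python; the grammar fragment, its probabilities, and the helper tree_prob are invented for illustration (not the lecture's grammar):

# A toy PCFG: each production A -> beta carries a probability p,
# and the probabilities of all rules with the same left-hand side sum to 1.
PCFG = {
    "S":   [(("NP", "VP"), 1.0)],
    "NP":  [(("Det", "N"), 0.6), (("N",), 0.4)],
    "VP":  [(("V", "NP"), 0.7), (("V",), 0.3)],
    "Det": [(("the",), 1.0)],
    "N":   [(("dog",), 0.5), (("cat",), 0.5)],
    "V":   [(("saw",), 1.0)],
}

for lhs, rules in PCFG.items():  # sanity check: probabilities sum to 1 per LHS
    assert abs(sum(p for _, p in rules) - 1.0) < 1e-9, lhs

def tree_prob(tree):
    """P(T) = product of rule probabilities; tree = (label, [children]) or a word."""
    if isinstance(tree, str):        # a terminal contributes no rule probability
        return 1.0
    label, children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    p = dict(PCFG[label])[rhs]       # probability of the rule label -> rhs
    for child in children:
        p *= tree_prob(child)
    return p

tree = ("S", [("NP", [("Det", ["the"]), ("N", ["dog"])]),
              ("VP", [("V", ["saw"]),
                      ("NP", [("Det", ["the"]), ("N", ["cat"])])])])
print(tree_prob(tree))  # 1.0 * 0.6 * 1.0 * 0.5 * 0.7 * 1.0 * 0.6 * 1.0 * 0.5 = 0.063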

Disambiguation as a probability problem
P(T,S) = P(T) P(S|T) = P(T), since a parse tree T includes all the words of S, so P(S|T) = 1
P(T_l) = .15 × .40 × .05 × .05 × .35 × .75 × .40 × .40 × .40 × .30 × .40 × .50 = 1.5 × 10⁻⁶
P(T_r) = .15 × .40 × .40 × .05 × .05 × .75 × .40 × .40 × .40 × .30 × .40 × .50 = 1.7 × 10⁻⁶

Probabilistic parsing
Probabilistic Earley algorithm: top-down parser with a dynamic programming table
Cocke-Younger-Kasami (CYK) algorithm: bottom-up parser with a dynamic programming table
Probabilities come from a treebank.

Probabilistic CYK
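The worked chart for this slide (a figure) did not survive the transcript. In its place, a compact sketch of probabilistic CYK in Python: it assumes the grammar is already in Chomsky Normal Form (binary rules A → B C plus lexical rules A → w), as CYK requires, and reuses the illustrative toy grammar from above. table[i][j] stores, for each non-terminal A, the probability of the best derivation of words[i:j] from A.

from collections import defaultdict

binary = {          # (B, C) -> [(A, p)] for rules A -> B C [p]
    ("NP", "VP"): [("S", 1.0)],
    ("Det", "N"): [("NP", 0.6)],
    ("V", "NP"):  [("VP", 0.7)],
}
lexical = {         # word -> [(A, p)] for rules A -> word [p]
    "the": [("Det", 1.0)],
    "dog": [("N", 0.5)],
    "cat": [("N", 0.5)],
    "saw": [("V", 1.0)],
}

def cyk(words):
    n = len(words)
    table = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):               # fill length-1 spans from the lexicon
        for a, p in lexical.get(w, []):
            table[i][i + 1][a] = max(table[i][i + 1][a], p)
    for length in range(2, n + 1):              # then longer spans, bottom-up
        for i in range(n - length + 1):
            j = i + length
            for k in range(i + 1, j):           # every split point i < k < j
                for b, pb in table[i][k].items():
                    for c, pc in table[k][j].items():
                        for a, p in binary.get((b, c), []):
                            table[i][j][a] = max(table[i][j][a], p * pb * pc)
    return table

words = "the dog saw the cat".split()
print(cyk(words)[0][len(words)]["S"])           # best probability of an S over the input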

Dependency grammars
Lexical dependencies between head words
The top-level predicate of a sentence is the root
Useful for free word order languages
Also simpler to parse

Dependencies
Example: "John likes tabby cats" (the tree figure was lost in transcription; POS tags: John/NNP likes/VBZ tabby/JJ cats/NNS; phrase-structure constituents: NP, VP, S)
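One reason dependency analyses suit free word order languages is how little machinery they need: one head index and one relation label per word. A sketch for the sentence above (the relation labels are illustrative):

# "John likes tabby cats": each token records its head (0 = root).
deps = [
    (1, "John",  "NNP", 2, "subject"),
    (2, "likes", "VBZ", 0, "root"),     # the top-level predicate is the root
    (3, "tabby", "JJ",  4, "modifier"),
    (4, "cats",  "NNS", 2, "object"),
]
for idx, word, pos, head, rel in deps:
    target = "ROOT" if head == 0 else deps[head - 1][1]
    print(f"{word}/{pos} --{rel}--> {target}")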

Representing Meaning

Introduction
Meaning representation languages: capturing the meaning of linguistic utterances using formal notation
Example: deciding what to order at a restaurant by reading a menu
Example: answering a question on an exam
Semantic analysis: mapping between language and real life
"I have a car": ∃x,y: Having(x) ∧ Haver(Speaker,x) ∧ HadThing(y,x) ∧ Car(y)

Verifiability
Example: "Does LeDog serve vegetarian food?"
Knowledge base (KB); sample entry in the KB: Serves(LeDog,VegetarianFood)
Convert the question to logical form and verify its truth value against the knowledge base.

Unambiguous representations
Example: "I want to eat some place near UM." (multiple interpretations)
Interpretation is important: the system should commit to one preferred interpretation.
Vagueness is different from ambiguity: "I want to eat Italian food." (which particular food?)

Canonical form
Does LeDog have vegetarian dishes?
Do they have vegetarian food at LeDog?
Are vegetarian dishes served at LeDog?
Does LeDog serve vegetarian fare?
Having vs. serving; food vs. fare vs. dishes (each word is ambiguous, but one sense of each matches the others): word sense disambiguation

Inference and variables; expressiveness
Inference and variables:
– "I'd like to find a restaurant that serves vegetarian food." → Serves(x,VegetarianFood)
– The system must be able to draw valid conclusions based on the meaning representations of inputs and its store of background knowledge.
Expressiveness:
– The system must be able to handle a wide range of subject matter.

Predicate-argument structure
"I want Italian food." (NP want NP)
"I want to spend less than five dollars." (NP want Inf-VP)
"I want it to be close by here." (NP want NP Inf-VP)
Thematic roles: e.g., the entity doing the wanting vs. the entity that is wanted (linking surface arguments with semantic/case roles)
Syntactic selection restrictions: *I found to fly to Dallas.
Semantic selection restrictions: *The risotto wanted to spend less than ten dollars.
"Make a reservation for this evening for a table for two persons at eight": Reservation(Hearer,Today,8PM,2)

First-order predicate calculus (FOPC)
Formula → AtomicFormula | Formula Connective Formula | Quantifier Variable … Formula | ¬ Formula | (Formula)
AtomicFormula → Predicate(Term…)
Term → Function(Term…) | Constant | Variable
Connective → ∧ | ∨ | ⇒
Quantifier → ∀ | ∃
Constant → A | VegetarianFood | LeDog
Variable → x | y | …
Predicate → Serves | Near | …
Function → LocationOf | CuisineOf | …

Example
"I only have five dollars and I don't have a lot of time."
Have(Speaker,FiveDollars) ∧ ¬Have(Speaker,LotOfTime)
With variables: Have(x,FiveDollars) ∧ ¬Have(x,LotOfTime)
Note: the grammar is recursive.
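Since the grammar is recursive, formulas are naturally represented as a recursive data structure. A minimal sketch in Python, building the formula above; the class names (Pred, Not, Conn) are invented for this example:

from dataclasses import dataclass
from typing import Any

@dataclass
class Pred:      # AtomicFormula: Predicate(Term, ...)
    name: str
    args: tuple

@dataclass
class Not:       # the negation ¬ Formula
    formula: Any

@dataclass
class Conn:      # Formula Connective Formula; op is "and", "or", or "implies"
    op: str
    left: Any
    right: Any

# Have(Speaker,FiveDollars) ∧ ¬Have(Speaker,LotOfTime)
f = Conn("and",
         Pred("Have", ("Speaker", "FiveDollars")),
         Not(Pred("Have", ("Speaker", "LotOfTime"))))
print(f)

Because the constructors nest, arbitrarily deep formulas come for free, mirroring the recursion in the grammar.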

Semantics of FOPC
FOPC sentences can be assigned a value of true or false.
"LeDog is near UM." → Near(LocationOf(LeDog),LocationOf(UM))

Variables and quantifiers
"A restaurant that serves Mexican food near UM."
∃x: Restaurant(x) ∧ Serves(x,MexicanFood) ∧ Near(LocationOf(x),LocationOf(UM))
"All vegetarian restaurants serve vegetarian food."
∀x: VegetarianRestaurant(x) ⇒ Serves(x,VegetarianFood)
If this sentence is true, it remains true for every substitution of x. When the antecedent is false for some x, the implication is vacuously true for that x.

Inference
Modus ponens: from α and α ⇒ β, infer β.
Example:
VegetarianRestaurant(Joe's)
∀x: VegetarianRestaurant(x) ⇒ Serves(x,VegetarianFood)
⊢ Serves(Joe's,VegetarianFood)

Uses of modus ponens
Forward chaining: as individual facts are added to the database, all derived inferences are generated.
Backward chaining: starts from queries. Example: the Prolog programming language:

father(X, Y) :- parent(X, Y), male(X).
parent(john, bill).
parent(jane, bill).
female(jane).
male(john).

?- father(M, bill).
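For contrast with Prolog's backward chaining, here is a minimal forward-chaining sketch in Python, using the vegetarian-restaurant rule from the previous slide's inference example; the encoding (tuples as atoms, lowercase strings as variables, single-premise rules only) is invented for illustration:

facts = {("VegetarianRestaurant", "Joes")}          # ground atoms
rules = [
    # forall x: VegetarianRestaurant(x) => Serves(x, VegetarianFood)
    ([("VegetarianRestaurant", "x")], ("Serves", "x", "VegetarianFood")),
]

def substitute(atom, binding):
    return tuple(binding.get(term, term) for term in atom)

def forward_chain(facts, rules):
    """Apply modus ponens until no new facts can be derived."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            prem = premises[0]                      # single-premise rules, for brevity
            for fact in list(facts):
                if len(fact) != len(prem) or fact[0] != prem[0]:
                    continue
                # lowercase terms are variables; constants must match exactly
                if not all(p.islower() or p == c for p, c in zip(prem[1:], fact[1:])):
                    continue
                binding = {p: c for p, c in zip(prem[1:], fact[1:]) if p.islower()}
                derived = substitute(conclusion, binding)
                if derived not in facts:
                    facts.add(derived)
                    changed = True
    return facts

print(forward_chain(facts, rules))
# {('VegetarianRestaurant', 'Joes'), ('Serves', 'Joes', 'VegetarianFood')}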

Examples from Russell & Norvig (1) (Exercise 7.2, p. 213)
Not all students take both History and Biology.
Only one student failed History.
Only one student failed both History and Biology.
The best score in History was better than the best score in Biology.
Every person who dislikes all vegetarians is smart.
No person likes a smart vegetarian.
There is a woman who likes all men who are vegetarian.
There is a barber who shaves all men in town who don't shave themselves.
No person likes a professor unless the professor is smart.
Politicians can fool some people all of the time or all people some of the time, but they cannot fool all people all of the time.

Categories & Events
Categories:
– VegetarianRestaurant(Joe's): categories are relations, not objects
– MostPopular(Joe's,VegetarianRestaurant): not FOPC!
– ISA(Joe's,VegetarianRestaurant): reification (turn all concepts into objects)
– AKO(VegetarianRestaurant,Restaurant)
Events:
– Reservation(Hearer,Joe's,Today,8PM,2)
– Problems: determining the correct number of roles; representing facts about the roles associated with an event; ensuring that all the correct inferences can be drawn; ensuring that no incorrect inferences can be drawn

MUC-4 Example
INCIDENT: DATE                30 OCT 89
INCIDENT: LOCATION            EL SALVADOR
INCIDENT: TYPE                ATTACK
INCIDENT: STAGE OF EXECUTION  ACCOMPLISHED
INCIDENT: INSTRUMENT ID
INCIDENT: INSTRUMENT TYPE
PERP: INCIDENT CATEGORY       TERRORIST ACT
PERP: INDIVIDUAL ID           "TERRORIST"
PERP: ORGANIZATION ID         "THE FMLN"
PERP: ORG. CONFIDENCE         REPORTED: "THE FMLN"
PHYS TGT: ID
PHYS TGT: TYPE
PHYS TGT: NUMBER
PHYS TGT: FOREIGN NATION
PHYS TGT: EFFECT OF INCIDENT
PHYS TGT: TOTAL NUMBER
HUM TGT: NAME
HUM TGT: DESCRIPTION          "1 CIVILIAN"
HUM TGT: TYPE                 CIVILIAN: "1 CIVILIAN"
HUM TGT: NUMBER               1: "1 CIVILIAN"
HUM TGT: FOREIGN NATION
HUM TGT: EFFECT OF INCIDENT   DEATH: "1 CIVILIAN"
HUM TGT: TOTAL NUMBER

On October 30, 1989, one civilian was killed in a reported FMLN attack in El Salvador.

Subcategorization frames
1. I ate
2. I ate a turkey sandwich
3. I ate a turkey sandwich at my desk
4. I ate at my desk
5. I ate lunch
6. I ate a turkey sandwich for lunch
7. I ate a turkey sandwich for lunch at my desk
No fixed "arity" (a problem for FOPC)

One possible solution
1. Eating1(Speaker)
2. Eating2(Speaker, TurkeySandwich)
3. Eating3(Speaker, TurkeySandwich, Desk)
4. Eating4(Speaker, Desk)
5. Eating5(Speaker, Lunch)
6. Eating6(Speaker, TurkeySandwich, Lunch)
7. Eating7(Speaker, TurkeySandwich, Lunch, Desk)
Meaning postulates are used to tie together the semantics of the predicates:
∀w,x,y,z: Eating7(w,x,y,z) ⇒ Eating6(w,x,y)
Scalability issues again!

Another solution
Say that everything is a special case of Eating7 with some arguments unspecified:
∃w,x,y: Eating(Speaker,w,x,y)
Two problems again:
– Too many commitments (e.g., implies there is no eating except at meals: lunch, dinner, etc.)
– No way to individuate events:
∃w,x: Eating(Speaker,w,x,Desk)
∃w,y: Eating(Speaker,w,Lunch,y)
These cannot be combined into ∃w: Eating(Speaker,w,Lunch,Desk), since the two formulas may describe two different events.

Reification
∃w: Isa(w,Eating) ∧ Eater(w,Speaker) ∧ Eaten(w,TurkeySandwich) (equivalent to sentence 2)
Reification:
– No need to specify a fixed number of arguments for a given surface predicate
– No more roles are postulated than are mentioned in the input
– No need for meaning postulates to specify logical connections among closely related examples
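Mirrored as data, a reified event is just an object carrying the role assertions that were actually mentioned, so no fixed arity is ever committed to. A sketch (the dict-based storage and the role names Meal and Place are invented for illustration):

# Each event is an object; only the roles mentioned in the input are asserted.
e1 = {"Isa": "Eating", "Eater": "Speaker"}                 # "I ate"
e2 = {"Isa": "Eating", "Eater": "Speaker",
      "Eaten": "TurkeySandwich"}                           # "I ate a turkey sandwich"
e7 = {"Isa": "Eating", "Eater": "Speaker", "Eaten": "TurkeySandwich",
      "Meal": "Lunch", "Place": "Desk"}                    # sentence 7
e2["Place"] = "Desk"   # a later-learned role needs no new predicate or postulate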

Representing time
1. I arrived in New York
2. I am arriving in New York
3. I will arrive in New York
All three share the same tenseless representation:
∃w: Isa(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,NewYork)

Representing time
Past (1): ∃i,e,w: Isa(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,NewYork) ∧ IntervalOf(w,i) ∧ EndPoint(i,e) ∧ Precedes(e,Now)
Present (2): ∃i,w: Isa(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,NewYork) ∧ IntervalOf(w,i) ∧ MemberOf(i,Now)
Future (3): ∃i,s,w: Isa(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,NewYork) ∧ IntervalOf(w,i) ∧ StartPoint(i,s) ∧ Precedes(Now,s)

Representing time
"We fly from San Francisco to Boston at 10." / "Flight 1390 will be at the gate an hour from now.": use of tenses
"Flight 1902 arrived late." / "Flight 1902 had arrived late.": "similar" tenses
"When Mary's flight departed, I ate lunch." / "When Mary's flight departed, I had eaten lunch.": reference point

Aspect
Stative: I know my departure gate
Activity: John is flying (no particular end point)
Accomplishment: Sally booked her flight (natural end point, resulting in a particular state)
Achievement: She found her gate
Figuring out statives: they resist the progressive and the imperative:
*I am needing the cheapest fare.
*I am wanting to go today.
*Need the cheapest fare!

Representing beliefs
Want, believe, imagine, know: all introduce hypothetical worlds.
"I believe that Mary ate British food."
Reified example:
∃u,v: Isa(u,Believing) ∧ Isa(v,Eating) ∧ Believer(u,Speaker) ∧ BelievedProp(u,v) ∧ Eater(v,Mary) ∧ Eaten(v,BritishFood)
However, this also implies:
∃v: Isa(v,Eating) ∧ Eater(v,Mary) ∧ Eaten(v,BritishFood)
(i.e., that Mary actually ate British food, which the belief report does not warrant)
Modal operators:
– Believing(Speaker,Eating(Mary,BritishFood)): not FOPC! Predicates in FOPC hold between objects, not between relations.
– Believes(Speaker, ∃v: Isa(v,Eating) ∧ Eater(v,Mary) ∧ Eaten(v,BritishFood))

Modal operators
Beliefs
Knowledge
Assertions
Issues: "If you are interested in baseball, the Red Sox are playing tonight." (the truth of the main clause does not actually depend on the if-clause)

Examples from Russell & Norvig (2) (Exercise 7.3, p. 214)
One more outburst like that and you'll be in contempt of court.
Annie Hall is on TV tonight if you are interested.
Either the Red Sox win or I am out ten dollars.
The special this morning is ham and eggs.
Maybe I will come to the party and maybe I won't.
Well, I like Sandy and I don't like Sandy.