LING 364: Introduction to Formal Semantics Lecture 23 April 11th.


Administrivia
Homework 4:
– graded
– you should get it back today

Administrivia
This Thursday:
– computer lab class
– help with Homework 5
– meet in SS 224

Today’s Topics
– Homework 4 Review
– Finish with Chapter 6: Quantifiers
– Quiz 5 (end of class)

Homework 4 Review
Questions 1 and 2
Worlds:
– w1 → {A,B}: horse(a). horse(b).
– w2 → {B,C}: horse(b). horse(c).
– w3 → {A,B,C}: horse(a). horse(b). horse(c).
– w4 → ∅: :- dynamic horse/1.
  (or ?- assert(horse(a)), retract(horse(a)).
  or ?- set_prolog_flag(unknown,fail).)
– w5 → {A,B,C,D,E}: horse(a). horse(b). horse(c). horse(d). horse(e).
– w6 → {A,B,C,D,E,F}: horse(a). horse(b). horse(c). horse(d). horse(e). horse(f).

Homework 4 Review
Prolog definitions common to worlds w1,…,w6:
horses(Sum) :- findall(X,horse(X),L), sum(L,Sum).
sum(L,X+Y) :- pick(X,L,Lp), pick(Y,Lp,_).
sum(L,X+Sum) :- pick(X,L,Lp), sum(Lp,Sum).
pick(X,[X|L],L).
pick(X,[_|L],Lp) :- pick(X,L,Lp).

Homework 4 Review
Questions 1 and 2
Answers to the query ?- findall(PL,horses(PL),L), length(L,N).
– w1 → {A,B}: L = [a+b], N = 1
– w2 → {B,C}: L = [b+c], N = 1
– w3 → {A,B,C}: L = [a+b,a+c,b+c,a+(b+c)], N = 4
– w4 → ∅: L = [], N = 0
– w5 → {A,B,C,D,E}: N = 26
  L = [a+b,a+c,a+d,a+e,b+c,b+d,b+e,c+d,c+e,d+e,
       a+(b+c),a+(b+d),a+(b+e),a+(c+d),a+(c+e),a+(d+e),
       a+(b+(c+d)),a+(b+(c+e)),a+(b+(d+e)),
       a+(b+(c+(d+e))),a+(c+(d+e)),b+(c+d),
       b+(c+e),b+(d+e),b+(c+(d+e)),c+(d+e)]
Counts for w5: 2 horses = C(5,2) = 5!/(2!3!) = 10; 3 horses = C(5,3) = 5!/(3!2!) = 10; 4 horses = C(5,4) = 5!/(4!1!) = 5; 5 horses = 1. Total: 26.

Homework 4 Review
Questions 1 and 2
Answers to the query ?- findall(PL,horses(PL),L), length(L,N).
– w6 → {A,B,C,D,E,F}: N = 57
  L = [a+b,a+c,a+d,a+e,a+f,b+c,b+d,b+e,b+f,c+d,c+e,c+f,d+e,d+f,e+f,
       a+(b+c),a+(b+d),a+(b+e),a+(b+f),a+(c+d),a+(c+e),a+(c+f),a+(d+e),a+(d+f),a+(e+f),
       a+(b+(c+d)),a+(b+(c+e)),a+(b+(c+f)),a+(b+(d+e)),a+(b+(d+f)),a+(b+(e+f)),
       a+(b+(c+(d+e))),a+(b+(c+(d+f))),a+(b+(c+(e+f))),
       a+(b+(c+(d+(e+f)))),
       a+(b+(d+(e+f))),
       a+(c+(d+e)),a+(c+(d+f)),a+(c+(e+f)),
       a+(c+(d+(e+f))),
       a+(d+(e+f)),
       b+(c+d),b+(c+e),b+(c+f),b+(d+e),b+(d+f),b+(e+f),
       b+(c+(d+e)),b+(c+(d+f)),b+(c+(e+f)),
       b+(c+(d+(e+f))),
       b+(d+(e+f)),
       c+(d+e),c+(d+f),c+(e+f),c+(d+(e+f)),d+(e+f)]
Counts for w6: 2 horses = C(6,2) = 6!/(2!4!) = 15; 3 horses = C(6,3) = 6!/(3!3!) = 20; 4 horses = C(6,4) = 6!/(4!2!) = 15; 5 horses = C(6,5) = 6!/(5!1!) = 6; 6 horses = C(6,6) = 1. Total: 57.
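The per-size counts above can be cross-checked outside Prolog with a quick Python sketch (the helper name num_sums is my own, not from the homework):

```python
from math import comb

# A plural sum picks an unordered group of two or more horses, so for a
# world with n horses the number of distinct sums is
# C(n,2) + C(n,3) + ... + C(n,n) = 2**n - n - 1.
def num_sums(n):
    return sum(comb(n, k) for k in range(2, n + 1))

print(num_sums(5))  # w5: 26
print(num_sums(6))  # w6: 57
```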

Homework 4 Review

Question 3: What is the Prolog query for “three horses”?
Answer:
– Notice that all the three-horse cases have the pattern/form _+(_+_), where each _ represents an individual horse, e.g. a+(b+c),a+(b+d),a+(b+e),a+(c+d),a+(c+e),a+(d+e)
– Query (1st attempt): ?- findall(PL,(horses(PL),PL=_+(_+_)),L).
– Example (w5):
  L = [a+(b+c),a+(b+d),a+(b+e),a+(c+d),a+(c+e),a+(d+e),a+(b+(c+d)),a+(b+(c+e)),a+(b+(d+e)),a+(b+(c+(d+e))),a+(c+(d+e)),b+(c+d),b+(c+e),b+(d+e),b+(c+(d+e)),c+(d+e)]
  Total: 16 (but the answer should be 10!)

Homework 4 Review
Question 3: What is the Prolog query for “three horses”?
Answer:
– Query (2nd attempt): ?- findall(PL,(horses(PL),PL=_+(_+H),\+H=_+_),L), length(L,N).
– Example (w5):
  L = [a+(b+c),a+(b+d),a+(b+e),a+(c+d),a+(c+e),a+(d+e),b+(c+d),b+(c+e),b+(d+e),c+(d+e)]
  N = 10 (correct)

Homework 4 Review
Question 3: What is the Prolog query for “three horses”?
Another way to deal with the question:
– Recognize that the Horse+Sum notation from the given definition:
  sum(L,X+Y) :- pick(X,L,Lp), pick(Y,Lp,_).
  sum(L,X+Sum) :- pick(X,L,Lp), sum(Lp,Sum).
  is isomorphic to [Head|Tail] list notation (here + plays the role of |)
– Write a recursive length predicate, call it len/2, for Horse+Sum:
  len(_+Sum,N) :- !, len(Sum,M), N is M+1.
  len(_,1).
– The query becomes: ?- findall(PL,(horses(PL),len(PL,3)),L).
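The isomorphism with lists can be mimicked outside Prolog as well; here is a rough Python analogue of len/2, where a right-branching sum like a+(b+(c+d)) is encoded as a nested pair (the tuple encoding and the name sum_len are my own illustration):

```python
# A sum such as a+(b+(c+d)) is encoded as the nested pair
# ('a', ('b', ('c', 'd'))): + behaves like | in [Head|Tail].
def sum_len(term):
    if isinstance(term, tuple):          # mirrors: len(_+Sum,N) :- !, len(Sum,M), N is M+1.
        _head, rest = term
        return 1 + sum_len(rest)
    return 1                             # mirrors: len(_,1).

print(sum_len(('a', ('b', 'c'))))        # a three-horse sum
print(sum_len(('a', ('b', ('c', 'd'))))) # a four-horse sum
```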

Homework 4 Review
Question 4: How would you write the query for “the three horses”?
Clue (given in lecture slides):
– ?- findall(X,dog(X),List), length(List,1).
– encodes the definite description “the dog”, i.e. the query holds (is true) when dog(X) is true and there is a unique such X in a given world
Combine this clue with the answer to Question 3. Resulting query:
– ?- findall(PL,(horses(PL),PL=_+(_+H),\+H=_+_),L), length(L,1).
Under the assumption that everything is equally salient, the query is true for world w3 only:
– L = [a+(b+c)]
Worlds w1, w2 and w4 have too few horses, and worlds w5 and w6 have too many.
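The uniqueness requirement behind the definite description can be sketched set-theoretically in Python (the string encoding of worlds and the function name the_three_horses are invented for illustration):

```python
from itertools import combinations

# "the three horses" denotes only if a world contains exactly one
# three-horse group; otherwise the definite description is undefined.
worlds = {'w1': 'ab', 'w2': 'bc', 'w3': 'abc',
          'w4': '', 'w5': 'abcde', 'w6': 'abcdef'}

def the_three_horses(horses):
    groups = list(combinations(horses, 3))          # every three-horse group
    return groups[0] if len(groups) == 1 else None  # unique, or no referent

for name, horses in worlds.items():
    print(name, the_three_horses(horses))  # only w3 yields a referent
```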

Back to Chapter 6

Negative Polarity Items
Negative Polarity Items (NPIs). Examples: ever, any.
Constrained distribution:
– they have to be licensed in some way
– grammatical in a “negated environment” or a question
Examples:
– (13a) Shelby won’t ever bite you
– (13b) Nobody has any money
– (14a) *Shelby will ever bite you
– (14b) *Noah has any money
  (* = ungrammatical)
– (15a) Does Shelby ever bite?
– (15b) Does Noah have any money?

Negative Polarity Items
Inside an if-clause:
– (16a) If Shelby ever bites you, I’ll put him up for adoption
– (16b) If Noah has any money, he can buy some candy
Inside an every-NP:
– (17a) Every dog which has ever bitten a cat feels the admiration of other dogs
– (17b) Every child who has any money is likely to waste it on candy
But not inside a some-NP:
– (18a) *Some dog which has ever bitten a cat feels the admiration of other dogs
– (18b) *Some child who has any money is likely to waste it on candy
Not to be confused with free choice (FC) any (meaning: ∀): any man can do that

Downwards and Upwards Entailment (DE & UE)
class ⇄ super-class. Example:
– hyponym ⇄ hypernym
– dog ⇄ animal
– Keeshond ⇄ dog
Inferencing:
– non-negative sentence: upwards
  (23a) I have a dog (entails)
  (23b) I have an animal
  (but I have a Keeshond is an invalid inference)
– negative sentence: downwards
  (24a) I don’t have a dog (entails)
  (24b) I don’t have a Keeshond
  (but I don’t have an animal is an invalid inference)
[diagram: nested sets Keeshond ⊂ dog ⊂ animal]

Downwards and Upwards Entailment (DE & UE)
Quantifier every has the semantics:
– {X: P1(X)} ⊆ {Y: P2(Y)}
– e.g. every woman likes ice cream: {X: woman(X)} ⊆ {Y: likes(Y,ice_cream)}
Every is DE for P1 and UE for P2. Examples:
– (25a) Every dog barks
– (25b) Every Keeshond barks (valid)
– (25c) Every animal barks (invalid)
Semantically, “Keeshond” is a sub-property, or subset, with respect to the set “dog”.
[diagram: nested sets Keeshond ⊂ dog ⊂ animal]

Downwards and Upwards Entailment (DE & UE)
Quantifier every has the semantics:
– {X: P1(X)} ⊆ {Y: P2(Y)}
– e.g. every woman likes ice cream: {X: woman(X)} ⊆ {Y: likes(Y,ice_cream)}
Every is DE for P1 and UE for P2. Examples:
– (25a) Every dog barks
– (25d) Every dog barks loudly (invalid)
– (25e) Every dog makes noise (valid)
Semantically, “barks loudly” is a subset with respect to the set “barks”, which (in turn) is a subset of the set “makes noise”.
[diagram: nested sets barks loudly ⊂ barks ⊂ makes noise]
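These monotonicity facts can be checked with a small set-theoretic sketch in Python: every(P1,P2) is just the subset test, and the example sets are invented for illustration.

```python
# every is modeled as P1 ⊆ P2; shrinking P1 (DE) or growing P2 (UE)
# preserves truth, while the reverse directions need not.
def every(p1, p2):
    return p1 <= p2                  # subset test on Python sets

keeshonds = {'rex'}
dogs = {'rex', 'fido'}
animals = {'rex', 'fido', 'tweety'}
barkers = {'rex', 'fido'}
noise_makers = barkers | {'tweety'}

assert every(dogs, barkers)          # every dog barks
assert every(keeshonds, barkers)     # valid: P1 shrank (DE in P1)
assert not every(animals, barkers)   # invalid: P1 grew
assert every(dogs, noise_makers)     # valid: P2 grew (UE in P2)
```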

Downwards and Upwards Entailment (DE & UE)
Inferencing:
– non-negative sentence: UE
  (23a) I have a dog (entails)
  (23b) I have an animal
  (but I have a Keeshond is invalid)
– negative sentence: DE
  (24a) I don’t have a dog (entails)
  (24b) I don’t have a Keeshond
  (but I don’t have an animal is invalid)
NPI-Licensing:
– non-negative sentence: UE
  (14a) *Shelby will ever bite you
  (14b) *Noah has any money
– negative sentence: DE
  (13a) Shelby won’t ever bite you
  (13b) Nobody has any money
Generalization: NPIs like ever and any are licensed by DE.

Downwards and Upwards Entailment (DE & UE)
Inside an every-NP:
– (17a) [Every [dog][which has ever bitten a cat]] feels the admiration of other dogs
– (17b) [Every [child][who has any money]] is likely to waste it on candy
Explanation:
– every is DE for P1 and UE for P2
– {X: P1(X)} ⊆ {Y: P2(Y)}
– in (17a), P1 = [dog][which has ever bitten a cat]
– in (17b), P1 = [child][who has any money]
Generalization: NPIs like ever and any are licensed by DE.

Quiz 5
Question 1: Is some UE or DE for P1 and for P2?
– Lecture 22 (Homework 5, Question 3): some: {X: P1(X)} ∩ {Y: P2(Y)} ≠ ∅
– Justify your answer using examples of valid/invalid inferences starting from Some dog barks
Question 2: Is no UE or DE for P1 and for P2?
– Lecture 22 (Homework 5, Question 3): no: {X: P1(X)} ∩ {Y: P2(Y)} = ∅
– Use No dog barks