
1 Lecture 20a Feature Based Grammars
CSCE Natural Language Processing, March 28, 2013
Topics: Description Logic III; Overview of Meaning
Readings: Text Chapter 18; NLTK Book Chapter 10

2 Overview
Last Time: (Programming) Computational Semantics
Today: Feature-based grammars
Readings: Text 18.3-18.5; NLTK Book Chapters 9 and 10
Next Time: Computational Lexical Semantics

3 Automated Reasoning Services
• Satisfiability: A concept C is satisfiable with respect to T if there exists a model I of T such that C^I ≠ ∅. We also say that I is a model of C.
• Subsumption: A concept C1 is subsumed by a concept C2 with respect to T if C1^I ⊆ C2^I for every model I of T. We also write C1 ⊑T C2 or T |= C1 ⊑ C2.
• Equivalence: Two concepts C1 and C2 are equivalent with respect to T if C1^I = C2^I for every model I of T. We also write C1 ≡T C2 or T |= C1 ≡ C2.
• Disjointness: Two concepts C1 and C2 are disjoint with respect to T if C1^I ∩ C2^I = ∅ for every model I of T.
Chuming Chen's Dissertation, 2008
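A minimal sketch of what these services amount to over a single finite interpretation (the concept names and sets below are invented for illustration; since subsumption, equivalence, and disjointness quantify over all models of T, one interpretation can only witness satisfiability or refute the other properties, never prove them):

person  = {'ann', 'bob', 'carla'}   # Person^I
student = {'ann', 'bob'}            # Student^I

def satisfiable_in(c):
    return len(c) > 0               # C^I is nonempty in this particular I

def refutes_subsumption(c1, c2):
    return not (c1 <= c2)           # some element of C1^I lies outside C2^I

def refutes_equivalence(c1, c2):
    return c1 != c2                 # the extensions differ in I

def refutes_disjointness(c1, c2):
    return bool(c1 & c2)            # the extensions overlap in I

print(satisfiable_in(student))               # True: Student has an instance
print(refutes_subsumption(student, person))  # False: I is consistent with Student ⊑ Person
print(refutes_subsumption(person, student))  # True:  I refutes Person ⊑ Student
print(refutes_disjointness(person, student)) # True:  I refutes disjointness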

4 ABox Reasoning
• ABox consistency checking: An ABox A is consistent with respect to a TBox T if there exists an interpretation I that is a model of both T and A.
• Instance checking: An individual a is an instance of a concept C with respect to T and A if a^I ∈ C^I for every model of T and A.
• Retrieval problem: Given an ABox A and a concept C, find all individuals a such that A |= a : C.
• Realization problem: Given an individual a and a set of concepts, find the most specific concepts C from the set such that A |= a : C. Note that the most specific concepts are those that are minimal with respect to the subsumption ordering ⊑.
Chuming Chen's Dissertation, 2008
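Over one fixed model, the ABox services reduce to membership tests on the concept extensions; a toy sketch (the individuals and concepts are again invented, not from the dissertation):

extensions = {
    'Person':  {'ann', 'bob', 'carla'},
    'Student': {'ann', 'bob'},
}

def instance_of(a, concept):
    # Instance checking in this model: a^I is in C^I
    return a in extensions[concept]

def retrieve(concept):
    # Retrieval: all individuals a with A |= a : C
    return sorted(extensions[concept])

def realization(a, candidates):
    # Most specific concepts containing a, i.e. minimal with respect to the
    # subset (subsumption) ordering among the candidates
    containing = [c for c in candidates if instance_of(a, c)]
    return [c for c in containing
            if not any(extensions[d] < extensions[c] for d in containing)]

print(instance_of('carla', 'Student'))            # False
print(retrieve('Student'))                        # ['ann', 'bob']
print(realization('ann', ['Person', 'Student']))  # ['Student']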

5 Figure 18.5 Quantifier Scope and Ambiguity
Every restaurant has a menu. Two possible meanings.
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)
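The two scopings can be written out with the same LogicParser used later in these slides (the predicate names below paraphrase Figure 18.5 rather than copy it; newer NLTK versions spell the parser nltk.sem.Expression.fromstring):
>>> import nltk
>>> lp = nltk.LogicParser()
>>> # "every" outscopes "a": each restaurant may have its own menu
>>> lp.parse('all x.(restaurant(x) -> exists y.(menu(y) & exists e.(having(e) & haver(e,x) & had(e,y))))')
>>> # "a" outscopes "every": a single menu shared by all restaurants
>>> lp.parse('exists y.(menu(y) & all x.(restaurant(x) -> exists e.(having(e) & haver(e,x) & had(e,y))))')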

6 Underspecification
Underspecification means supporting "ambiguous" meanings by leaving the ambiguous aspects (here, quantifier scope) unresolved. So we need to be able to:
"Create underspecified representations that embody all possible readings without explicitly enumerating them"
Extract the readings if necessary
Choose amongst those readings
Scope-neutral core: Haver(e, Restaurant) ∧ Had(e, Menu)
"It should remain agnostic about the placement of quantifiers"

7 Stores
Cooper storage (1983). For the meanings of the nodes of the parse tree we have been using predicate calculus (FOL) formulae. In Cooper's approach we replace the single formula with a "store": a core formula plus a list of quantified expressions gathered from below.
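NLTK ships a small Cooper-storage implementation in nltk.sem.cooper_storage; a sketch of driving it on the NLTK book's "every girl chases a dog" example (older NLTK releases spell trees[0].label() as trees[0].node):
>>> from nltk.sem import cooper_storage as cs
>>> trees = cs.parse_with_bindops('every girl chases a dog',
...                               grammar='grammars/book_grammars/storage.fcfg')
>>> semrep = trees[0].label()['SEM']      # trees[0].node['SEM'] in older NLTK
>>> cs_semrep = cs.CooperStore(semrep)
>>> print(cs_semrep.core)                 # the scope-neutral core formula
>>> for bo in cs_semrep.store:            # the stored quantified (binding) expressions
...     print(bo)
>>> cs_semrep.s_retrieve()                # pull the quantifiers back out in every order
>>> for reading in cs_semrep.readings:    # one formula per quantifier scoping
...     print(reading)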

8 Figure 18.6 Semantic Stores for VP
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

9 Hole Semantics
Hole semantics (Bos 1996). Instead of applying λ-reductions, we replace λ-variables with labels and "holes". Dominance constraints say which labels must end up below which holes, and each way of plugging the holes with labels that respects the constraints yields one reading.
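A toy enumeration of pluggings, to make the idea concrete (the fragment templates, hole names, and predicate spellings are my own approximation of the textbook's example, not its notation):

from itertools import permutations

# Labeled fragments; {hN} marks a hole inside a fragment, h0 is the top hole.
fragments = {
    'l1': 'all x.(restaurant(x) -> {h1})',
    'l2': 'exists y.(menu(y) & {h2})',
    'l3': 'exists e.(having(e) & haver(e,x) & had(e,y))',
}
holes = ['h0', 'h1', 'h2']
# Dominance constraints: each (label, hole) pair says the label must end up
# somewhere below that hole in the finished formula tree.
constraints = [('l1', 'h0'), ('l2', 'h0'), ('l3', 'h1'), ('l3', 'h2')]

def holes_in(label):
    return [h for h in holes if '{' + h + '}' in fragments[label]]

def below(hole, plugging, seen=None):
    """Labels that sit below `hole` once the plugging is applied."""
    seen = set() if seen is None else seen
    if hole in seen:
        return set()
    seen.add(hole)
    label = plugging[hole]
    result = {label}
    for h in holes_in(label):
        result |= below(h, plugging, seen)
    return result

def realize(hole, plugging):
    """Build the formula string for the subtree rooted at `hole`."""
    text = fragments[plugging[hole]]
    for h in holes_in(plugging[hole]):
        text = text.replace('{' + h + '}', realize(h, plugging))
    return text

# A plugging is a one-to-one assignment of labels to holes; keep only those
# satisfying every dominance constraint.  Exactly the two scopings of
# "Every restaurant has a menu" survive and are printed.
for perm in permutations(fragments):
    plugging = dict(zip(holes, perm))
    if all(lab in below(h, plugging) for lab, h in constraints):
        print(realize('h0', plugging))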

10 Figure 18.7 Hole Semantics Representation for "Every restaurant has a menu"
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

11 Figure 18.8
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

12 Feature- and Unification-Based Approaches
Feature structures are associated with the nodes of the parse tree, and the meaning representation is assembled by performing unifications as the tree is built.

13 ∃e Closing(e) ∧ Closed(e, Rhumba)
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)
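A minimal feature-grammar sketch of this unification approach that derives the formula above for "Rhumba closed" (the two-rule grammar below is my own toy, not the textbook's; NLTK 2 spells FeatureGrammar.fromstring as parse_fcfg, tree.label() as tree.node, and parser.parse as nbest_parse):

import nltk
from nltk.parse import FeatureChartParser

# Each rule states how the parent's SEM feature is built from the daughters'
# SEM features; unification threads the pieces up the tree and beta-reduction
# (simplify) assembles the final formula.
grammar = nltk.grammar.FeatureGrammar.fromstring(r"""
% start S
S[SEM=<?subj(?vp)>] -> NP[SEM=?subj] VP[SEM=?vp]
VP[SEM=?v] -> IV[SEM=?v]
NP[SEM=<\P.P(rhumba)>] -> 'Rhumba'
IV[SEM=<\x.exists e.(closing(e) & closed(e,x))>] -> 'closed'
""")

parser = FeatureChartParser(grammar)
for tree in parser.parse('Rhumba closed'.split()):
    print(tree.label()['SEM'].simplify())
# expected output: exists e.(closing(e) & closed(e,rhumba))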

14 Figure 18.9 DAG for Semantic Features
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

15 (Figure slide; from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

16 (Figure slide; from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

17 Figure 18.10 Earley + Semantics
(Figure from Jurafsky & Martin, Speech and Language Processing, 2nd ed., 2009.)

18 Problem with SQL and Agents

19 Consistency of Statements
(5) a. Sylvania is to the north of Freedonia. b. Freedonia is a republic.
(6) a. The capital of Freedonia has a population of 9,000. b. No city in Freedonia has a population of 9,000.
(7) b. Freedonia is to the north of Sylvania.
NLTK Book Chapter 10
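One way to test such consistency questions mechanically is to ask the Mace4 model builder for a model of the whole set, along the lines of the NLTK book's model-building examples (assumes Prover9/Mace4 are installed; the interface varies a little across NLTK versions):
>>> lp = nltk.LogicParser()
>>> a1 = lp.parse('north_of(s, f)')                  # (5a)
>>> a2 = lp.parse('all x. all y. (north_of(x, y) -> -north_of(y, x))')
>>> a3 = lp.parse('north_of(f, s)')                  # (7b)
>>> mb = nltk.Mace(5)
>>> print(mb.build_model(None, [a1, a2]))            # a model exists: consistent
True
>>> print(mb.build_model(None, [a1, a2, a3]))        # no model: inconsistent
False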

20 Models again “A model for a set W of sentences is a formal representation of a situation in which all the sentences in W are true.” NLTK Book Chapter 10

21 Logic and the NLTK
>>> import nltk
>>> nltk.boolean_ops()
negation         -
conjunction      &
disjunction      |
implication      ->
equivalence      <->
>>> lp = nltk.LogicParser()
>>> lp.parse('-(P & Q)')
<NegatedExpression -(P & Q)>
>>> lp.parse('P & Q')
<AndExpression (P & Q)>
>>> lp.parse('P | (R -> Q)')
<OrExpression (P | (R -> Q))>
>>> lp.parse('P <-> -- P')
<IffExpression (P <-> --P)>
NLTK Book Chapter 10

22 Theorem Proving with Prover9
First-order version:
>>> NotFnS = lp.parse('-north_of(f, s)')
>>> SnF = lp.parse('north_of(s, f)')
>>> R = lp.parse('all x. all y. (north_of(x, y) -> -north_of(y, x))')
>>> prover = nltk.Prover9()
>>> prover.prove(NotFnS, [SnF, R])
True
Propositional version:
>>> lp = nltk.LogicParser()
>>> SnF = lp.parse('SnF')
>>> NotFnS = lp.parse('-FnS')
>>> R = lp.parse('SnF -> -FnS')
NLTK Book Chapter 10

23 Models and Values
>>> val = nltk.Valuation([('P', True), ('Q', True), ('R', False)])
>>> val['P']
True
>>> dom = set([])
>>> g = nltk.Assignment(dom)
>>> m = nltk.Model(dom, val)
>>> print m.evaluate('(P & Q)', g)
True
>>> print m.evaluate('-(P & Q)', g)
False
>>> print m.evaluate('(P & R)', g)
False
NLTK Book Chapter 10

24 First Order Logic and the NLTK
>>> tlp = nltk.LogicParser(type_check=True)
>>> parsed = tlp.parse('walk(angus)')
>>> parsed.argument
<ConstantExpression angus>
>>> parsed.argument.type
e
>>> parsed.function
<ConstantExpression walk>
>>> parsed.function.type
<e,?>
NLTK Book Chapter 10

25 Free Variables
>>> lp = nltk.LogicParser()
>>> lp.parse('dog(cyril)').free()
set([])
>>> lp.parse('dog(x)').free()
set([Variable('x')])
>>> lp.parse('own(angus, cyril)').free()
set([])
>>> lp.parse('exists x.dog(x)').free()
set([])
>>> lp.parse('((some x. walk(x)) -> sing(x))').free()
set([Variable('x')])
>>> lp.parse('exists x.own(y, x)').free()
set([Variable('y')])
NLTK Book Chapter 10

26 Prover9: north_of Revisited
>>> NotFnS = lp.parse('-north_of(f, s)')
>>> SnF = lp.parse('north_of(s, f)')
>>> R = lp.parse('all x. all y. (north_of(x, y) -> -north_of(y, x))')
>>> prover = nltk.Prover9()
>>> prover.prove(NotFnS, [SnF, R])
True
>>> FnS = lp.parse('north_of(f, s)')
>>> prover.prove(FnS, [SnF, R])
False
NLTK Book Chapter 10

27 >>> v = """. bertie => b. olive => o. cyril => c
>>> v = """ ... bertie => b ... olive => o ... cyril => c ... boy => {b} ... girl => {o} ... dog => {c} ... walk => {o, c} ... see => {(b, o), (c, b), (o, c)} ... """ >>> val = nltk.parse_valuation(v) >>> print val {'bertie': 'b', 'boy': set([('b',)]), 'cyril': 'c', 'dog': set([('c',)]), 'girl': set([('o',)]), 'olive': 'o', 'see': set([('o', 'c'), ('c', 'b'), ('b', 'o')]), 'walk': set([('c',), ('o',)])} NLTK Book Chapter 10

28 Satisfiers
>>> dom = val.domain
>>> m = nltk.Model(dom, val)
>>> g = nltk.Assignment(dom)
>>> fmla1 = lp.parse('girl(x) | boy(x)')
>>> m.satisfiers(fmla1, 'x', g)
set(['b', 'o'])
>>> fmla2 = lp.parse('girl(x) -> walk(x)')
>>> m.satisfiers(fmla2, 'x', g)
set(['c', 'b', 'o'])
>>> fmla3 = lp.parse('walk(x) -> girl(x)')
>>> m.satisfiers(fmla3, 'x', g)
set(['b', 'o'])
NLTK Book Chapter 10

29 Admire Relation
>>> v2 = """
... bruce => b
... cyril => c
... elspeth => e
... julia => j
... matthew => m
... person => {b, e, j, m}
... admire => {(j, b), (b, b), (m, e), (e, m), (c, a)}
... """
>>> val2 = nltk.parse_valuation(v2)
NLTK Book Chapter 10

30 Satisfiers in the Admire Model
>>> dom2 = val2.domain
>>> m2 = nltk.Model(dom2, val2)
>>> g2 = nltk.Assignment(dom2)
>>> fmla4 = lp.parse('(person(x) -> exists y.(person(y) & admire(x, y)))')
>>> m2.satisfiers(fmla4, 'x', g2)
set(['a', 'c', 'b', 'e', 'j', 'm'])
>>> fmla5 = lp.parse('(person(y) & all x.(person(x) -> admire(x, y)))')
>>> m2.satisfiers(fmla5, 'y', g2)
set([])
NLTK Book Chapter 10

31 Prover9 Test Arguments
arguments = [
    ('(man(x) <-> (not (not man(x))))', []),
    ('(not (man(x) & (not man(x))))', []),
    ('(man(x) | (not man(x)))', []),
    ('(man(x) & (not man(x)))', []),
    ('(man(x) -> man(x))', []),
    ('(man(x) <-> man(x))', []),
    ('(not (man(x) <-> (not man(x))))', []),
    ('mortal(Socrates)', ['all x.(man(x) -> mortal(x))', 'man(Socrates)']),
    ('((all x.(man(x) -> walks(x)) & man(Socrates)) -> some y.walks(y))', []),

32 Prover9 Test Arguments (continued)
    ('(all x.man(x) -> all x.man(x))', []),
    ('some x.all y.sees(x,y)', []),
    ('some e3.(walk(e3) & subj(e3, mary))',
     ['some e1.(see(e1) & subj(e1, john) & some e2.(pred(e1, e2) & walk(e2) & subj(e2, mary)))']),
    ('some x e1.(see(e1) & subj(e1, x) & some e2.(pred(e1, e2) & walk(e2) & subj(e2, mary)))',
     ['some e1.(see(e1) & subj(e1, john) & some e2.(pred(e1, e2) & walk(e2) & subj(e2, mary)))'])
]
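These (goal, assumptions) pairs can be run through the prover from slide 22 in a loop, roughly as NLTK's own Prover9 demo does (assumes Prover9 is installed and that lp and prover are still bound):
>>> for goal, assumptions in arguments:
...     g = lp.parse(goal)
...     alist = [lp.parse(a) for a in assumptions]
...     print('%s  %s' % (prover.prove(g, alist), goal))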

