Discourse Representation Theory: An overview Part I: The linguistic problem.

Slide 2: Outline
1. A meaning representation language
2. Compositionality
3. Lambda Calculus
4. Moving to Discourse
5. Shortcomings of FOL

Slide 3: 1. A meaning representation language

Slide 4: A semantic analysis of NL should answer at least the following questions:
- What does a given sentence mean?
- How is the meaning representation of a sentence built?
- How do we infer one piece of information from another?
- How is the meaning representation of a discourse built?
This boils down to having meaning representations of linguistic utterances.

Slide 5: Why First Order Logic?
Apparently all natural languages show a predicate-argument structure. Is there anything shared by these two sentences?
1. "John loves Mary"
2. "Mary loves John"
Their linear orders barely overlap, but they seem strictly related all the same:
Loves(john, mary)
Loves(mary, john)
Loves(x, y)
A philosophical argument: assume a speaker S has a meaning representation for both proper names. Would it be possible for S to understand 1. and not 2.? If not, how else could a speaker grasp the meaning of infinitely many sentences (never heard before) out of finite resources (the lexicon)?

Slide 6: What does a given sentence mean?
A theoretical assumption: "The meaning of a sentence is its truth value" (better: its truth conditions, i.e. the states of affairs of the world which make the sentence true).
- FOL supports a reliable concept of satisfaction and truth on models.
- If we can translate a NL sentence S into a FOL formula Φ (called the "proposition" corresponding to S), then we have a description of its meaning in terms of truth in a model M = ⟨D, I⟩.
"John loves Mary" ⇒ loves(john, mary)
  M ⊨ Φ iff ⟨I(john), I(mary)⟩ ∈ I(loves)
"A student loves Mary" ⇒ ∃x(student(x) ∧ loves(x, mary))
  M ⊨ Φ iff for any variable assignment g, M, g ⊨ ∃x(student(x) ∧ loves(x, mary)), iff for some x-variant g′ of g, M, g′ ⊨ student(x) ∧ loves(x, mary)
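The truth-in-a-model definition above can be made concrete with a small sketch. This is an illustrative toy, not part of the original slides: the model is a pair of a domain D and an interpretation I, with predicates interpreted as sets of tuples, and the existential quantifier checked by ranging over the domain (the entity names are invented for the example).

```python
# Toy model M = (D, I): D is the domain, I the interpretation function.
D = {"john", "mary", "sue"}
I = {
    "john": "john",
    "mary": "mary",
    "student": {("sue",)},                         # unary predicate: a set of 1-tuples
    "loves": {("john", "mary"), ("sue", "mary")},  # binary relation: a set of pairs
}

def atom(pred, *terms):
    """M |= pred(t1,...,tn) iff <I(t1),...,I(tn)> is in I(pred)."""
    return tuple(I[t] for t in terms) in I[pred]

def exists(phi):
    """M |= Ex.phi iff phi holds of some entity, i.e. under some x-variant assignment."""
    return any(phi(e) for e in D)

# "John loves Mary" -> loves(john, mary)
assert atom("loves", "john", "mary")

# "A student loves Mary" -> Ex(student(x) & loves(x, mary))
assert exists(lambda e: (e,) in I["student"] and (e, "mary") in I["loves"])
```

Note how the assignment-based clause for ∃ becomes a simple search over the domain: an x-variant assignment corresponds to picking one entity e.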

Slide 7: We can draw inferences from the meaning of the sentences we understand
We are told that "Rudy's is a vegetarian restaurant", and that "vegetarian restaurants don't serve meat". In FOL we have a reliable way to conclude "Rudy's doesn't serve meat", using an inference rule instantiated by:
VegRest(rudys)
∀x(VegRest(x) → ¬Serve(x, meat))
____________________________
¬Serve(rudys, meat)
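The inference pattern on this slide (universal instantiation followed by modus ponens) can be sketched as a hypothetical one-rule forward chainer. The fact encoding and the `NOT_Serve` tag for the negated conclusion are assumptions made for the illustration, not a general theorem prover:

```python
# Knowledge base: one ground fact; the universally quantified rule
# Ax(VegRest(x) -> NOT Serve(x, meat)) is hard-coded in apply_rule below.
facts = {("VegRest", "rudys")}

def apply_rule(facts):
    """Instantiate the rule on every matching fact (UI + modus ponens)."""
    derived = set()
    for pred, arg in facts:
        if pred == "VegRest":
            # x := arg, then detach the consequent NOT Serve(arg, meat)
            derived.add(("NOT_Serve", arg, "meat"))
    return facts | derived

closure = apply_rule(facts)
assert ("NOT_Serve", "rudys", "meat") in closure   # "Rudy's doesn't serve meat"
```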

Slide 8: So, does FOL capture the what, the content, of Natural Language semantics?
It's quite controversial.
Adverb semantics:
- "Milly swims slowly". MODIFICATION OF VERBS ⇒ HIGHER-ORDER PREDICATION
Adjective semantics:
- "There is a red apple" ⇒ ∃x(apple(x) ∧ red(x)) INTERSECTION OF CLASSES
- "There is a small elephant in the zoo" ⇒ ∃x(elephant(x) ∧ small(x) ∧ in_the_zoo(x))? NO!
- "John is a skillful violinist" ⇒ skillful(john) ∧ violinist(john)? NO! NON-INTERSECTIVE ADJECTIVES
Indexical expressions semantics:
- "I heard a noise coming from behind me". HOW DO WE MAP "BEHIND" ONTO A (SUBJECT-INDEPENDENT) MODEL?
Etc.

Slide 9: Even if FOL could capture the what of meaning, in itself it says nothing about how to build FOL formulas starting from sentences, and how to do so in a systematic way.
Take for example: "John loves Mary" ⇒ loves(john, mary). It seems that:
- "John" contributes the constant john
- "Mary" contributes the constant mary
- "loves" contributes the binary relation loves(x, y)
So sentence meaning flows from the lexicon, from words. But how exactly is it built? Why couldn't we derive loves(mary, john) as well? The missing ingredient is syntactic structure!

Slide 10: Here there is a hierarchy, not just a linear order of words, and it constrains the way we can fill the argument slots in the meaning representation of the relation loves(x, y). But notice that there is no specific mechanism defined on meaning representations that allows this unification. So we need a syntax-driven semantics for Natural Language, such that:
- the lexical items in a sentence give the basic ingredients for meaning representations;
- syntactic structure tells us how the semantic contributions of the parts are to be joined together.

Slide 11: 2. Compositionality

Slide 12: Principle of Compositionality
"The meaning of a complex expression is a function of the meanings of its parts and of the syntactic rules by which they are combined."
In order to instantiate it we need to define:
- semantic primitives for lexical entries;
- an independent syntactic theory contributing a notion of "parts" and "rules" of combination (we'll assume a suitable CFG and parser);
- a description of the "function", or mapping, from syntax to semantics.

Slide 13: 3. Lambda Calculus

Slide 14: We extend FOL with a variable-binding operator λ. Basic expressions are of the kind λx.P(x), where x is a variable and P(x) is a FOL expression containing x.
- Semantics: occurrences of variables bound by λ are place-holders for missing information. We abstract over the bound variables to mark the slots for substitution.
β-reduction (also called β-conversion) is the basic mechanism that applies λ-expressions to terms, yielding new expressions with all occurrences of the formal parameters bound to the specified terms:
λx.P(x)(a) ⇒ P(a)
λx.left(x)(john) ⇒ left(john)
- Semantics: substitute the argument term for all occurrences of the bound variable in the functor.
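β-reduction can be mimicked directly with host-language functions. In this illustrative sketch (not from the slides), FOL formulas are plain strings and λ-abstraction is a Python closure, so function application performs the substitution for us:

```python
# lam x. left(x) : abstraction over the argument slot of "left"
left = lambda x: f"left({x})"

# Beta-reduction: applying the abstraction substitutes the term for the bound variable.
assert left("john") == "left(john)"     # lam x.left(x)(john) => left(john)

def abstract(pred):
    """Build lam x. P(x) for an arbitrary predicate symbol P."""
    return lambda x: f"{pred}({x})"

assert abstract("P")("a") == "P(a)"     # lam x.P(x)(a) => P(a)
```

The analogy is deliberate: λ-abstraction marks a slot for missing information, and closures give us exactly that slot-filling behavior for free.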

Slide 15: Lexical representations:
- PN: "John" ⇒ j
- IV: "walks" ⇒ λx.walks(x)
- Det: "a" ⇒ λX.λY.∃z(X(z) ∧ Y(z))
- TV: "loves" ⇒ λy.λx.loves(x, y)
We now have suitable representations to augment a CFG with semantic attachments such that, for each CF rule, we have a function F that builds the meaning of the symbol on the left out of the meanings of the symbols on the right:
A → a1, …, an    S(A) = {F(S(a1), …, S(an))}
NP → pn          S(NP) = {S(pn)}
NP → det N       S(NP) = {S(det)(S(N))}
e.g. "a man": (λX.λY.∃z(X(z) ∧ Y(z)))(λx.man(x)) ⇒ λY.∃z(man(z) ∧ Y(z))
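The determiner entry above can be run in the same string-plus-closure style used earlier. This is a sketch under those assumptions (formulas as strings, the unicode connectives are just text, `walks` added to close off the open slot):

```python
# Det "a": lam X. lam Y. Ez(X(z) & Y(z))
det_a = lambda X: lambda Y: f"∃z({X('z')} ∧ {Y('z')})"
man = lambda x: f"man({x})"            # N "man": lam x. man(x)
walks = lambda x: f"walks({x})"        # IV "walks": lam x. walks(x)

# NP -> det N : "a man" = S(det)(S(N)), leaving the VP slot Y open
a_man = det_a(man)                     # lam Y. Ez(man(z) & Y(z))

# Feeding it a VP discharges the remaining slot:
assert a_man(walks) == "∃z(man(z) ∧ walks(z))"
```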

Slide 16: "John loves Mary"
John (PN) ⇒ j; John (NP) ⇒ j
Mary (PN) ⇒ m; Mary (NP) ⇒ m
loves (TV) ⇒ λy.λx.loves(x, y)
loves Mary (VP) ⇒ λx.loves(x, m)
John loves Mary (S) ⇒ loves(j, m)
We needed to know:
- exactly which variables in the TV's meaning representation have to be replaced by the semantics of the TV's arguments: we did this by turning the semantic attachment of the TV into a λ-expression, which makes variables externally available for binding;
- how to replace them: β-conversion.
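The derivation on this slide can be replayed bottom-up in the same sketch style (formulas as strings, λ-terms as closures; an illustration, not the slides' own implementation):

```python
john = "j"                                        # PN "John" => j
mary = "m"                                        # PN "Mary" => m
loves = lambda y: lambda x: f"loves({x},{y})"     # TV: lam y. lam x. loves(x, y)

# VP -> TV NP : S(VP) = S(TV)(S(NP_obj))
vp = loves(mary)      # lam x. loves(x, m)

# S -> NP VP : S(S) = S(VP)(S(NP_subj))
s = vp(john)

assert s == "loves(j,m)"   # syntax, not linear order, fixed the argument slots
```

Because the object is consumed first and the subject last, loves(m, j) simply cannot be derived from this tree, which is the point of the slide.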

Slide 17: 4. Moving to Discourse

Slide 18: Up to now we have addressed the meaning of linguistic units up to the sentence.
A discourse =def a sequence of collocated, related sentences.
NOTICE: we'll focus only on the semantics of monologue (dialogue might involve much more complex interaction).
"John went to Bill's car dealership to check out an Acura Integra. He looked at it for about an hour."
- The semantics of the second sentence is not independent of that of the first.
- We have to look back at the semantics of preceding sentences to assign a meaning to the two pronouns.
- We find expressions back in the discourse which have been made contextually relevant.

Slide 19: We are focusing on a subset of discourse semantics, i.e. co-reference.
Some terminology:
Reference =def the process by which expressions (noun phrases, proper names, pronouns, etc., also called "referring expressions") denote, i.e. point to, an entity (e.g. a person), called the referent. Notice: we are assuming a suitable theory of reference and will be interested in modeling the phenomenon of co-reference in discourse.
Anaphora =def reference to an entity (called the "antecedent") which has been previously introduced into the discourse. E.g.: "Mia is a woman. She loves Vincent."

Slide 20: FUNCTIONAL REQUIREMENTS FOR CO-REFERENCE
"John went to Bill's car dealership to check out an Acura Integra. He looked at it for about an hour."
1. A method for building a Discourse Model that evolves with the dynamically-changing discourse it represents.
2. A method for mapping between referring expressions in the discourse and entities in the Discourse Model.
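The two requirements can be sketched as a toy discourse model plus a recency-based resolver. Everything here is an assumption made for illustration: the feature inventory (gender, animacy), the entity names, and the "most recent compatible antecedent" heuristic are stand-ins for a real theory of reference:

```python
discourse_model = []   # entities introduced so far, most recent last

def introduce(name, features):
    """Requirement 1: a referring expression grows the evolving discourse model."""
    discourse_model.append({"name": name, **features})

def resolve(pronoun_features):
    """Requirement 2: map a pronoun to the most recent compatible antecedent."""
    for entity in reversed(discourse_model):
        if all(entity.get(k) == v for k, v in pronoun_features.items()):
            return entity["name"]
    return None   # no compatible antecedent: resolution fails

# "John went to Bill's car dealership to check out an Acura Integra."
introduce("john", {"gender": "masc", "animate": True})
introduce("dealership", {"gender": "neut", "animate": False})
introduce("integra", {"gender": "neut", "animate": False})

# "He looked at it ...": "he" finds john; "it" finds the most recent neuter entity.
assert resolve({"gender": "masc"}) == "john"
assert resolve({"gender": "neut"}) == "integra"
```

The last assertion also shows the heuristic's limits: recency picks the Integra over the dealership, which happens to be right here but is not guaranteed in general.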

Slide 21: 5. Shortcomings of FOL

Slide 22: Successful co-reference:
1. Mia is a woman. She loves Vincent.
2. A woman snorts. She collapses.
Failing co-reference:
3. Every woman snorts. ?She collapses.
4. Mia didn't order a two-dollar sandwich. ?Vincent tasted it.

Slide 23: 1. "Mia is a woman. She loves Vincent."
FOL representations:
First attempt: WOMAN(mia) ∧ LOVE(x, vincent)
Post-processing: WOMAN(mia) ∧ LOVE(mia, vincent)
WHAT'S THE RULE?

Slide 24: 2. "A woman snorts. She collapses."
FOL representations:
∃z(WOMAN(z) ∧ SNORT(z)) ∧ COLLAPSE(x)
∃z(WOMAN(z) ∧ SNORT(z) ∧ COLLAPSE(z))
AND WHAT'S THE RULE HERE?

Slide 25: 3. "Every woman snorts. ?She collapses."
FOL representations:
∀z(WOMAN(z) → SNORT(z)) ∧ COLLAPSE(x)
∀z((WOMAN(z) ∧ SNORT(z)) → COLLAPSE(z))
Why doesn't it work?

Slide 26: It seems that we should move to representations which:
- keep the truth-conditional dimension of meaning typical of FOL (we want to be able to evaluate discourses on standard models as we did for sentences);
- resolve anaphora in a systematic way;
- mirror in some way the process of discourse understanding, with the functional requirements we described.