LTAG Semantics for Questions Aleksandar Savkov

Contents Introduction Hamblin’s idea Karttunen’s upgrade Goals of the paper Scope properties of wh-phrases Quantificational NPs Wh-phrases as quantifiers Multiple wh-questions Long-distance wh-dependencies Comparison to other approaches Embedded interrogatives Ambiguity in multiple wh-questions References

Introduction Hamblin’s semantics for questions Every question denotes a set of propositions expressed by its possible answers Who came? {‘Bill came.’, ‘John came.’, ‘Dan came.’…} Is it raining? {‘It’s raining.’, ‘It’s not raining.’}
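Hamblin’s idea can be sketched in a toy model: a proposition is the set of worlds in which it is true, and a question denotes the set of propositions expressed by its possible answers. The worlds and individuals below are invented for illustration; they are not from the paper.

```python
# Toy model of Hamblin's semantics for questions.
# Each world records the set of individuals who came.
worlds = [
    frozenset(),                   # w0: nobody came
    frozenset({"Bill"}),           # w1: only Bill came
    frozenset({"John"}),           # w2: only John came
    frozenset({"Bill", "John"}),   # w3: Bill and John came
]

def came(person):
    """The proposition 'person came': the set of worlds where it holds."""
    return frozenset(w for w in worlds if person in w)

domain = ["Bill", "John", "Dan"]

# Hamblin denotation of 'Who came?': one proposition per possible answer.
who_came = {p: came(p) for p in domain}
```

Note that `came("Dan")` is the empty set of worlds here, yet the proposition ‘Dan came.’ is still a member of the question’s denotation: Hamblin collects possible answers, not true ones.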

Introduction Karttunen’s upgrade Every question denotes a set of propositions expressing only true answers
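The contrast with Karttunen’s upgrade can be shown in the same toy style: starting from the Hamblin set of possible answers, Karttunen’s denotation keeps only the propositions true in the actual world. World names and facts below are invented for illustration.

```python
# Toy contrast between Hamblin and Karttunen question denotations.
worlds = ["w_bill", "w_bill_john"]
facts = {"w_bill": {"Bill"}, "w_bill_john": {"Bill", "John"}}
domain = ["Bill", "John", "Dan"]

def came(person):
    """The proposition 'person came' as the set of worlds where it holds."""
    return frozenset(w for w in worlds if person in facts[w])

# Hamblin: all possible answers.
hamblin = {p: came(p) for p in domain}

# Karttunen: only the answers true in the given actual world.
def karttunen(actual_world):
    return {p: prop for p, prop in hamblin.items() if actual_world in prop}
```

Evaluated at `"w_bill"`, only ‘Bill came.’ survives; at `"w_bill_john"`, both ‘Bill came.’ and ‘John came.’ do.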

Introduction Goals of the paper (Romero, Kallmeyer, Babko-Malaya 2004) capture scope properties of quantificational elements within the question achieve the correct semantics for interrogatives embedded under e.g. know

Introduction Example of different scopes for wh- and non-wh-quantifiers:

Introduction Example for multiple wh- phrases:

Introduction Example for correct semantics of interrogatives:

Scope properties of wh-phrases Quantificational NPs Wh-phrases as quantifiers Multiple wh-questions Long-distance wh-dependencies Comparison to other approaches

Quantificational NPs We assume that quantifiers such as everybody have a multi-component set containing an auxiliary tree (contributing the scope part) and an initial tree (contributing the predicate argument)

every(x,person(x,s0),laugh(x,s0))

disambiguation: 3 -> l3, 4 -> l1 every(x,person(x,s0),laugh(x,s0))
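The disambiguation step above can be sketched as a plugging in a hole semantics: underspecified formulas carry labels and holes, and a disambiguation maps each hole to a label, yielding the resolved formula. The particular labels and formulas below are an illustrative reconstruction following the slide’s hole-to-label pattern, not the paper’s exact representation.

```python
# Sketch of label/hole plugging in a flat semantic representation.
# l1: contributed by the verb's initial tree, l2/l3 by the quantifier.
formulas = {
    "l1": ("laugh", "x", "s0"),
    "l2": ("every", "x", "h3", "h4"),   # h3 = restriction, h4 = scope
    "l3": ("person", "x", "s0"),
}

def plug(term, plugging):
    """Recursively replace each hole by the formula its label points to."""
    if isinstance(term, str) and term.startswith("h"):
        return plug(formulas[plugging[term]], plugging)
    if isinstance(term, tuple):
        return tuple(plug(t, plugging) for t in term)
    return term

# The disambiguation 3 -> l3, 4 -> l1 resolves the underspecified formula.
resolved = plug(formulas["l2"], {"h3": "l3", "h4": "l1"})
```

With this plugging, `resolved` is the term structure of every(x,person(x,s0),laugh(x,s0)), matching the formula on the slide.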

Wh-phrases as quantifiers Repeating the Karttunen style

Wh-phrases as quantifiers 5 -> l4, 9 -> l5, 10 -> l2, 7 -> l6, 14 -> l7, 15 -> l1 Q3:λp.p(s0) and some(x,person(x,s0), p=λs.every(y,person(y,s/s0), like(x,y,s)))

Multiple wh-questions To treat in-situ wh-quantifiers correctly, we need the minimal scope of any NP substitution node To achieve that, we need minimal scopes for both wh- and non-wh-quantifiers We will use the feature WH for the wh-quantifier and P for the non-wh-quantifier
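The WH and P features do their work through feature unification on tree nodes. A minimal sketch of flat feature unification is shown below; the feature structures are invented for illustration, with hole names standing in for the scope values the paper’s MAXS, WH and P features would carry.

```python
# Minimal sketch of feature unification for passing scope values.
def unify(f1, f2):
    """Unify two flat feature dicts; return None on conflicting values."""
    result = dict(f1)
    for key, value in f2.items():
        if key in result and result[key] != value:
            return None          # unification failure
        result[key] = value
    return result

np_node = {"WH": "h5"}                 # wh-quantifier's scope hole
verb_node = {"WH": "h5", "P": "h6"}    # verb node carrying both features
merged = unify(np_node, verb_node)
```

Unification succeeds when shared features agree (`merged` carries both WH and P) and fails when they clash, which is what restricts the quantifiers to the appropriate scope windows.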

Long-distance wh-dependencies In long-distance wh-dependencies, one must make sure that the wh-quantifier scopes over all verbs in the sentence in order to provide an argument for the most embedded one.

Long-distance wh-dependencies

Comparison to other approaches Karttunen style semantics Ginzburg and Sag, 2000 Our approach

Karttunen style semantics Draws a distinction between wh-scope and non-wh-scope Uses different semantic types for all the relevant categories Wh-quantifiers combine with functions from situations to sets of propositions Thus all wh-quantifiers must scope over all non-wh-quantifiers

Ginzburg and Sag Ontological distinction between states-of-affairs (SOAs) and propositions One builds propositions, questions, outcomes and facts from SOAs Non-wh-quantifiers take an SOA as their nuclear scope while wh-quantifiers take a proposition, so the latter scope is wider

Our approach We use a ‘flat’ semantic framework in the style of MRS (Copestake et al. 1999) The semantic contribution of the elementary and auxiliary trees is a set of formulas No type distinction is available to which the scope properties of wh- and non-wh-quantifiers could be tied No distinction between SOAs and propositions MAXS, WH and P features and feature unification are used to define the appropriate scope windows.

Embedded interrogatives Unless bound by an operator, situation arguments are replaced by the utterance situation In embedded interrogatives the issue is how to bind the situation variable

Embedded interrogatives

Ambiguity in multiple wh-questions Some multiple wh-questions are ambiguous Example: Who remembers where Mary keeps which book? This could be read in two different ways: 1) Bill remembers where Mary keeps which book. 2) Joe remembers where Mary keeps Aspects and Max remembers where Mary keeps Syntactic Structures.

References Romero, M., Kallmeyer, L. and Babko-Malaya, O. (2004). LTAG Semantics for Questions. Karttunen, L. (1977). Syntax and Semantics of Questions.