Kees van Deemter Matthew Stone Formal Issues in Natural Language Generation Lecture 4 Shieber 1993; van Deemter 2002

Semantics Formal semantics concentrates on information content and its representation. To what extent does good NLG depend on the right information? To what extent does good NLG depend on the right representation? Note: this applies to GRE, but also more generally.

Information in NLG Logical space: all the ways things could turn out to be

Information in NLG Logical space: all the ways things could turn out to be John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C.

A proposition = information: it identifies particular cases as real possibilities.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. Here is a particular proposition: say, that John ate (at least) the apple and the banana.
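
To make the picture concrete, here is a minimal sketch (not from the original slides; the names are illustrative) that models the logical space as the eight ways John's meal could turn out, and a proposition as the set of cases it leaves open.

```python
from itertools import combinations

ITEMS = {"apple", "banana", "cake"}

# Logical space: every way things could turn out = every subset of ITEMS
# that John might have eaten (8 cases in total).
logical_space = [
    frozenset(combo)
    for r in range(len(ITEMS) + 1)
    for combo in combinations(sorted(ITEMS), r)
]

# A proposition is the set of cases it identifies as real possibilities,
# e.g. "John ate (at least) the apple and the banana".
ate_apple_and_banana = {w for w in logical_space if {"apple", "banana"} <= w}

print(len(logical_space))                         # 8
print(sorted(map(sorted, ate_apple_and_banana)))  # the 2 cases left open
```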

A wrinkle Computer systems get their knowledge of logical space, common ground, etc. from statements in formal logic. Lots of formulas can carry the same information.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. (A ∧ B ∧ C) ∨ (A ∧ B ∧ ¬C)

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. A ∧ B

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. (A ∧ B) ∨ (A ∧ B)

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. F ∨ (A ∧ B) (with F the always-false proposition)

Shieber 1993 The problem of logical form equivalence is about how you get this representation. In general, an algorithm can choose this representation in one of two ways: in a reasoner that does general, non-grammatical inference, or using at least some grammatical knowledge.

Shieber 1993 If it is chosen without access to the grammar (modularly), then the surface realizer has to know which logical formulas mean the same. This is intractable: philosophically, because the notion is impossible to pin down; and computationally, because our best attempts are not computable.
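
Even in the decidable propositional case, recognizing that two formulas mean the same is expensive. A brute-force sketch (illustrative, not from the lecture): check equivalence by enumerating all 2^n valuations of the atoms, which blows up exponentially; for the richer logics realizers actually use, no such procedure exists at all.

```python
from itertools import product

def equivalent(f, g, atoms):
    """Brute-force propositional equivalence: compare f and g on
    every valuation of the atoms (2**len(atoms) cases)."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if f(v) != g(v):
            return False
    return True

# Two formulas carrying the same information "A and B" (cf. the example above):
f1 = lambda v: (v["A"] and v["B"] and v["C"]) or (v["A"] and v["B"] and not v["C"])
f2 = lambda v: v["A"] and v["B"]

print(equivalent(f1, f2, ["A", "B", "C"]))  # True
```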

What about GRE? Arguably, GRE uses a grammar. –Parameters such as the preference order on properties reflect knowledge of how to communicate effectively. –Decisions about usefulness or completeness of a referring expression reflect beliefs about utterance interpretation. Maybe this is a good idea for NLG generally.

Letting grammar fix representation Choice of alternatives reflects linguistic notions: discourse coherence, information structure, function. (A ∧ B ∧ C) ∨ (A ∧ B ∧ ¬C); A ∧ B; (A ∧ B) ∨ (A ∧ B); F ∨ (A ∧ B)

Now there's a new question If grammar is responsible for how information is represented, where does the information itself come from? To answer, let's consider information and communication in more detail.

Information in NLG Logical space: all the ways things could turn out to be

Information in NLG Common ground: the possibilities mutual knowledge still leaves open.

Information in NLG John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. Common ground: the possibilities mutual knowledge still leaves open.

Information in NLG Private knowledge: the things you take as possible.

Information in NLG John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. Private knowledge: the things you take as possible.

Information in NLG Communicative Goal: an important distinction that should go on the common ground.

Information in NLG John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. Communicative Goal: an important distinction that should go on the common ground.

Formal question What information satisfies what communicative goals? Objective: modularity. General reasoning gives communicative goals; grammar determines information. Another meaty issue.

Information in NLG John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. Communicative Goal: an important distinction that should go on the common ground.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. What John ate was a piece of fruit.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. John didn't eat the cake.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. John ate one thing.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. John ate at most one thing.

For example John ate nothing. John ate the cake (C). John ate B+C. John ate A+C. John ate the banana (B). John ate the apple (A). John ate A+B. John ate A, B+C. What John ate was the apple.

Formal questions What information satisfies what communicative goals? Let u be the information in the utterance, g the goal information, c the information in the common ground, and p the speaker's private information. u = g? p ∩ u ⊆ g? c ∩ u = c ∩ g? p ∩ c ∩ u ⊆ c ∩ g?

Logical form equivalence An inference problem is inevitable: u = g? p ∩ u ⊆ g? c ∩ u = c ∩ g? p ∩ c ∩ u ⊆ c ∩ g? But the problems are very different: not always as precise (entailment vs. equivalence), not always as abstract (assumptions, context, etc.). Consequences for philosophical & computational tractability.
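
With propositions modelled as sets of possible cases (a sketch under that assumption; u, g, c, p as on the slide, with invented example propositions), the four conditions are one-liners:

```python
# Propositions as sets of possible cases (worlds). Using the meal
# example: a world is the set of items John ate.
worlds = [frozenset(s) for s in
          [(), ("A",), ("B",), ("C",),
           ("A", "B"), ("A", "C"), ("B", "C"), ("A", "B", "C")]]

def prop(condition):
    """The proposition = the set of worlds satisfying the condition."""
    return {w for w in worlds if condition(w)}

u = prop(lambda w: "A" in w)               # utterance: "John ate the apple"
g = prop(lambda w: "C" not in w)           # goal: "John didn't eat the cake"
c = prop(lambda w: len(w) <= 1)            # common ground: at most one thing
p = prop(lambda w: w == frozenset({"A"}))  # private: exactly the apple

print(u == g)              # u = g: exact match of information?
print(p & u <= g)          # p ∩ u ⊆ g: given private info, u entails g?
print(c & u == c & g)      # c ∩ u = c ∩ g: same info relative to common ground?
print(p & c & u <= c & g)  # p ∩ c ∩ u ⊆ c ∩ g: entailment relative to both?
```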

GRE, again We can use GRE to illustrate, assuming: c = the domain (context set); g = the set of individuals to identify, represented as a set of discourse referents; u = the identifying description, represented as a conjunction of properties; solution criterion: c ∩ u = c ∩ g.
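
A sketch of that criterion over an invented toy domain (the individuals and properties are illustrative assumptions): a description u succeeds if, within the domain c, its extension picks out exactly the target set g.

```python
# Toy domain: individuals and their properties (illustrative only).
properties = {
    "d1": {"dog", "black", "large"},
    "d2": {"dog", "white", "small"},
    "d3": {"cat", "black", "small"},
}

c = set(properties)        # the domain / context set
g = {"d1"}                 # the target: individuals to identify

def extension(description):
    """Individuals in the domain satisfying every property in the description."""
    return {x for x in c if description <= properties[x]}

u = {"dog", "black"}           # candidate description: "the black dog"
print(extension(u) == c & g)   # solution criterion c ∩ u = c ∩ g: True
```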

GRE How does the algorithm choose the representation of u? It finds a canonical representation of u, based on incremental selection of properties. And how do the representation and choice of u relate to the representation and choice of an actual utterance? The representation of u works as a sentence plan.
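
A minimal sketch of that incremental selection, in the spirit of Dale & Reiter's Incremental Algorithm (the preference order and helper names are assumptions, not the lecture's code): walk the properties in a fixed preference order, keep each one that removes at least one distractor, and stop once only the target remains.

```python
def incremental_description(target, domain, properties, preference):
    """Select properties in preference order until the description
    excludes every distractor (Dale & Reiter-style sketch)."""
    description = []
    distractors = {x for x in domain if x != target}
    for prop in preference:
        if prop not in properties[target]:
            continue
        ruled_out = {x for x in distractors if prop not in properties[x]}
        if ruled_out:                      # keep properties that do some work
            description.append(prop)
            distractors -= ruled_out
        if not distractors:                # target uniquely identified
            return description
    return None                            # no distinguishing description exists

properties = {
    "d1": {"dog", "black", "large"},
    "d2": {"dog", "white", "small"},
    "d3": {"cat", "black", "small"},
}
print(incremental_description("d1", set(properties), properties,
                              ["dog", "cat", "black", "white", "large", "small"]))
# ['dog', 'black'] -- a canonical, order-dependent representation of u
```

The output is canonical precisely because the preference order is fixed: the same target in the same context always yields the same conjunction of properties, which can then serve as the sentence plan.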