Semantics: Representations and Analyses

Slides adapted from Kathy McKeown, Dan Jurafsky, Chris Manning; video thanks to Lyn Walker, Michael Johnston.

What kinds of meaning do we want to capture?
- Categories/entities: IBM, Jane, a black cat, Pres. Bush
- Events: running a mile, AS elected governor of CA
- Time: Oct 30, next week, in 2 years
- Aspect: Jack knows how to run. Jack is running. Jack ran the mile in 5 min.
- Beliefs, Desires and Intentions (BDI)

What Can Serve as a Meaning Representation?
Anything that allows us to:
- Answer questions (What is the tallest building in the world?)
- Determine truth (Is the blue block on the red block?)
- Draw inferences (If the blue block is on the red block and the red block is on the tallest building in the world, then the blue block is on the tallest building in the world)

Meaning Representations
All represent the 'linguistic meaning' of "I have a car" and a state of affairs in some world. All consist of structures composed of symbols representing objects and the relations among them.
- FOPC: ∃x,y Having(x) ∧ Haver(Speaker,x) ∧ HadThing(y,x) ∧ Car(y)
- Semantic net: a "having" node with labeled arcs to its "haver" (the speaker) and its "had-thing" (the car)

- Conceptual Dependency diagram: Car ⇐ Poss-By ⇐ Speaker
- Frame: Having [Haver: S, HadThing: Car]

A Standard Representation: Predicate-Argument Structure
Represents concepts and the relationships among them:
- Nouns as concepts or arguments: ball in red(ball)
- Adjectives, adverbs, and verbs as predicates: red in red(ball)
Subcategorization (or argument) frames specify the number, position, and syntactic category of a predicate's arguments:
- NP likes NP
- NP likes Inf-VP
- NP likes NP Inf-VP
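Subcategorization frames like those above can be checked mechanically. A minimal sketch, assuming a hypothetical lexicon format in which each frame lists the syntactic categories of a verb's arguments in order (the `SUBCAT` table and `frame_ok` helper are illustrative, not from the slides):

```python
# Hypothetical subcategorization lexicon: verb -> allowed argument-frame tuples.
SUBCAT = {
    "likes": [
        ("NP", "NP"),      # NP likes NP      (Dan likes soup)
        ("NP", "Inf-VP"),  # NP likes Inf-VP  (Dan likes to eat)
    ],
}

def frame_ok(verb, arg_cats):
    """Check whether a sequence of argument categories matches some frame."""
    return tuple(arg_cats) in SUBCAT.get(verb, [])

print(frame_ok("likes", ["NP", "NP"]))  # True
print(frame_ok("likes", ["NP", "PP"]))  # False
```

A real grammar would attach such frames to lexical entries so the parser can reject argument patterns a verb does not license.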

Semantic (Thematic) Roles
Subcat frames link arguments in surface structure with their semantic roles:
- Agent: George hit Bill. Bill was hit by George.
- Patient: George hit Bill. Bill was hit by George.
Selectional restrictions constrain the types of arguments a verb takes:
- George assassinated the senator. *The spider assassinated the fly.
- assassinate: intentional (political?) killing
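Selectional restrictions can be sketched as type constraints on role fillers. The type hierarchy, role labels, and `satisfies` helper below are illustrative assumptions, not a standard resource:

```python
# Toy type assignments for the slide's examples (assumed, not a real ontology).
ISA = {"george": "human", "senator": "human", "spider": "animal", "fly": "animal"}

# assassinate requires a human Agent and a human Patient.
RESTRICTIONS = {
    "assassinate": {"agent": "human", "patient": "human"},
}

def satisfies(verb, roles):
    """Check each role filler's type against the verb's restrictions."""
    return all(ISA.get(roles[role]) == typ
               for role, typ in RESTRICTIONS[verb].items())

print(satisfies("assassinate", {"agent": "george", "patient": "senator"}))  # True
print(satisfies("assassinate", {"agent": "spider", "patient": "fly"}))      # False
```

The second call fails for the same reason *The spider assassinated the fly* is odd: the fillers violate the verb's type constraints.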

First Order Predicate Calculus
Not ideal as a meaning representation, and it doesn't do everything we want, but better than many alternatives:
- Supports the determination of truth
- Supports compositionality of meaning
- Supports question-answering (via variables)
- Supports inference

NL Mapping to FOPC
Terms: constants, functions, variables
- Constants: objects in the world, e.g. Huey
- Functions: concepts, e.g. sisterof(Huey)
- Variables: x, e.g. sisterof(x)
Predicates: symbols that refer to relations that hold among objects in some domain, or properties that hold of some object in a domain:
- likes(Huey, kibble)
- cat(Huey)

Logical connectives permit compositionality of meaning:
- kibble(x) ⇒ likes(Huey,x): "Huey likes kibble"
- cat(Vera) ∧ odd(Vera): "Vera is an odd cat"
- sleeping(Huey) ∨ eating(Huey): "Huey is either sleeping or eating"
Sentences in FOPC can be assigned truth values:
- Atomic formulae are T or F based on their presence or absence in a DB (Closed World Assumption?)
- Composed meanings are inferred from the DB and the meaning of the logical connectives
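The DB-lookup view of truth can be made concrete with a toy closed-world evaluator: an atomic formula is true iff it appears in the DB, and the connectives compose truth values. The nested-tuple encoding of formulae is an illustrative choice, not from the slides:

```python
# Ground facts (the "DB"). Anything not listed here is false (closed world).
DB = {("cat", "Vera"), ("odd", "Vera"), ("likes", "Huey", "kibble")}

def holds(f):
    """Evaluate a formula: ('and'|'or'|'implies', f1, f2), ('not', f1), or atomic."""
    op = f[0]
    if op == "and":     return holds(f[1]) and holds(f[2])
    if op == "or":      return holds(f[1]) or holds(f[2])
    if op == "not":     return not holds(f[1])
    if op == "implies": return (not holds(f[1])) or holds(f[2])
    return f in DB  # atomic formula: true iff present in the DB

print(holds(("and", ("cat", "Vera"), ("odd", "Vera"))))  # True: "Vera is an odd cat"
print(holds(("sleeping", "Huey")))                       # False: absent, so false
```

Note how the closed-world assumption does the work in the last line: absence from the DB is treated as falsehood rather than ignorance.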

Inference: from cat(Huey), sibling(Huey,Vera), and cat(Huey) ∧ sibling(Huey,Vera) ⇒ cat(Vera), we can conclude cat(Vera).
Limitations:
- Do 'and' and 'or' in natural language really mean '∧' and '∨'? Mary got married and had a baby. And then… Your money or your life!
- Does '⇒' mean 'if'? If you go, I'll meet you there.
- How do we represent other connectives? She was happy but ignorant.

Quantifiers
- Existential quantification: There is a unicorn in my garden. Some unicorn is in my garden.
- Universal quantification: The unicorn is a mythical beast. Unicorns are mythical beasts.
- What about many? A few? Several? A couple?
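Over a finite domain, the two classical quantifiers reduce to `any` and `all`. A sketch with an assumed toy domain and fact set (the names are illustrative):

```python
# A tiny model: two individuals and some assumed facts about them.
DOMAIN = {"u1", "rose"}
FACTS = {("unicorn", "u1"), ("in_garden", "u1"), ("mythical", "u1")}

def exists(pred):
    """∃x pred(x) over the finite domain."""
    return any(pred(x) for x in DOMAIN)

def forall(pred):
    """∀x pred(x) over the finite domain."""
    return all(pred(x) for x in DOMAIN)

# "Some unicorn is in my garden": ∃x unicorn(x) ∧ in_garden(x)
print(exists(lambda x: ("unicorn", x) in FACTS and ("in_garden", x) in FACTS))
# "Unicorns are mythical beasts": ∀x unicorn(x) ⇒ mythical(x)
print(forall(lambda x: ("unicorn", x) not in FACTS or ("mythical", x) in FACTS))
```

The slide's closing question is exactly what this sketch cannot handle: *many*, *a few*, and *several* are not definable as `any`/`all` over the domain, which is why generalized quantifiers are needed.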

Temporal Representations
How do we represent time and temporal relationships between events?
- Last year Martha Stewart was happy but soon she will be in prison.
Where do we get temporal information?
- Verb tense
- Temporal expressions
- Sequence of presentation
Linear representations: Reichenbach '47

- Utterance time (U): when the utterance occurs
- Reference time (R): the temporal point-of-view of the utterance
- Event time (E): when the events described in the utterance occur
George is eating a sandwich: E,R,U
George had eaten a sandwich (when he realized…): E – R – U
George will eat a sandwich: U,R – E
While George was eating a sandwich, his mother arrived.
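Each tense thus maps to an ordering of E, R, and U, which suggests a simple table-driven representation. This sketch covers only the slide's three examples; the tense names and ASCII notation ("," for simultaneous, "-" for precedes) are assumptions:

```python
# Reichenbach-style tense schemas: each tense fixes an ordering of
# event time E, reference time R, and utterance time U.
REICHENBACH = {
    "present":      "E,R,U",    # George is eating a sandwich.
    "past perfect": "E - R - U",  # George had eaten a sandwich (when he realized...)
    "future":       "U,R - E",  # George will eat a sandwich.
}

def schema(tense):
    """Look up the E/R/U ordering for a tense."""
    return REICHENBACH[tense]

print(schema("past perfect"))  # E - R - U
```

A fuller table would distinguish all of Reichenbach's tense schemas (simple past E,R – U, present perfect E – R,U, and so on), but the lookup mechanism is the same.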

Verbs and Event Types: Aspect
- Statives: states or properties of objects at a particular point in time. I am hungry.
- Activities: events with no clear endpoint. I am eating.
- Accomplishments: events with durations and endpoints that result in some change of state. I ate dinner.
- Achievements: events that change state but have no particular duration; they occur in an instant. I got the bill.

Beliefs, Desires and Intentions
It is very hard to represent internal speaker states like believing, knowing, wanting, assuming, and imagining. These are not well modeled by a simple DB-lookup approach, so we must distinguish truth in the world from truth in some possible world:
- George imagined that he could dance.
- George believed that he could dance.
Augment FOPC with special modal operators that take logical formulae as arguments, e.g. believe, know.

Believes(George, dance(George))
Knows(Bill, Believes(George, dance(George)))
Mutual belief: I believe you believe I believe….
Practical importance: modeling belief in dialogue; Clark's grounding.

Compositional Semantics
The meaning of the whole is made up of the meaning of its parts (predicates and arguments):
- George cooks. Dan eats. Dan is sick. → Cook(George), Eat(Dan), Sick(Dan)
- If George cooks and Dan eats, Dan will get sick. → (Cook(George) ∧ Eat(Dan)) ⇒ Sick(Dan)
Given Cook(George) and Eat(Dan), we can then infer Sick(Dan).
Syntax tells us how to combine the parts.
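The one-step inference from Cook(George), Eat(Dan), and the implication to Sick(Dan) is just modus ponens, which a tiny forward-chainer can perform. The tuple encoding of atoms and rules is illustrative:

```python
# Known ground facts and one rule: premises (a set of atoms) => conclusion.
facts = {("Cook", "George"), ("Eat", "Dan")}
rules = [({("Cook", "George"), ("Eat", "Dan")}, ("Sick", "Dan"))]

def forward_chain(facts, rules):
    """Apply modus ponens repeatedly until no new facts are derived."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(("Sick", "Dan") in forward_chain(set(facts), rules))  # True
```

This is the payoff of compositionality: because sentence meanings are built from predicates and connectives, the derived representations feed directly into inference.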

Syntax-Driven Semantics
[S [NP [Nom [N Dan]]] [VP [V eats]]] → eat(Dan)
Task: can we link up syntactic structures to a corresponding semantic representation to produce the 'meaning' structure of a sentence in the course of parsing it?

Rule-to-Rule Hypothesis
We don't want to have to specify, for every possible parse tree, what semantic representation it maps to. We want to identify general mappings from parse trees to semantic representations.
Hypothesis: a mapping exists between the rules of the grammar and rules of semantic representation.

Semantic Attachments
Extend each grammar rule with instructions on how to map the components of the rule to a semantic representation:
S → NP VP {VP.sem(NP.sem)}
Each semantic function is defined in terms of the semantic representation of choice.
Problem: how do we define these functions, and how do we specify their composition, so that we always get the meaning representation we want from our grammar?

A 'Simple' Example
AyCaramba serves meat.
Associating constants with arguments:
- ProperNoun → AyCaramba {AyCaramba}
- MassNoun → meat {Meat}
Defining functions to produce these from input:
- NP → ProperNoun {ProperNoun.sem}
- NP → MassNoun {MassNoun.sem}
Assumption: the meaning representations of children are passed up to parents for non-branching constituents.
Verbs are where the action is.

How do we combine these pieces?
V → serves {∃e,x,y Isa(e,Serving) ∧ Server(e,x) ∧ Served(e,y)}
Will every verb need its own distinct representation? George served in the army.
VP → V NP
Goal: ∃e,x Isa(e,Serving) ∧ Server(e,x) ∧ Served(e,Meat)
The VP semantics must tell us:
- Which variables are to be replaced by which arguments?
- How is this replacement done?
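One standard answer is to treat the verb's semantics as a λ-term that consumes its arguments one at a time; Python closures can stand in for λ-abstraction. A minimal sketch for the AyCaramba example, with the event formula flattened to a string for readability (an illustrative simplification, not the slides' mechanism in full):

```python
# V -> serves: a function awaiting the object, then the subject (curried λ-term).
serves_sem = lambda obj: lambda subj: (
    f"∃e Isa(e,Serving) ∧ Server(e,{subj}) ∧ Served(e,{obj})")

# Non-branching NP rules just pass the child's semantics up unchanged.
np_sem = {"AyCaramba": "AyCaramba", "meat": "Meat"}

vp_sem = serves_sem(np_sem["meat"])   # VP -> V NP  : V.sem(NP.sem)
s_sem = vp_sem(np_sem["AyCaramba"])   # S  -> NP VP : VP.sem(NP.sem)
print(s_sem)  # ∃e Isa(e,Serving) ∧ Server(e,AyCaramba) ∧ Served(e,Meat)
```

Currying answers both questions from the slide: each application replaces exactly one variable, and the order of the λ-abstractions fixes which argument goes where.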

Non-Compositional Language
What do we do with language whose meaning isn't derived from the meanings of its parts?
Metaphor:
- You're the cream in my coffee. She's the cream in George's coffee.
- The break-in was just the tip of the iceberg. This was only the tip of Shirley's iceberg.
Idioms:
- The old man finally kicked the bucket. The old man finally kicked the proverbial bucket.

Problems with Syntax-Driven Semantics
- Syntactic structures often don't fit semantic structures very well.
- Important semantic elements are often distributed very differently in the trees of sentences that mean 'the same': I like soup. Soup is what I like.
- Parse trees contain many structural elements that are not clearly important to making semantic distinctions.
- Syntax-driven semantic representations are sometimes pretty bizarre.

Summary
Many hard problems remain in full semantic representation:
- Temporal relations: tense, aspect
- BDI
Current representations are impoverished in many respects.
Next time: read Ch 16.