COMP 4060 Natural Language Processing: Semantics


COMP 4060 Natural Language Processing Semantics

Semantics
Semantics I
- General Introduction
- Types of Semantics
- From Syntax to Semantics
Semantics II
- Desiderata for Representation
- Logic-based Semantics

Semantics I

Semantics
Distinguish between
- surface structure (syntactic structure) and
- deep structure (semantic structure) of sentences.
Different forms of Semantic Representation
- logic formalisms
- ontology / semantic representation languages
- Case Frame Structures (Fillmore)
- Conceptual Dependency Theory (Schank)
- DL and similar KR languages
- Ontologies

Semantic Representations
Semantic representation based on some kind of (formal) representation language:
- Semantic Networks
- Conceptual Dependency Graphs
- Case Frames
- Ontologies
- Description Logics and similar Knowledge Representation languages

Constructing a Semantic Representation
General:
- Start with the surface structure derived from the parser.
- Map the surface structure to a semantic structure.
- Use phrases as sub-structures.
- Find concepts and representations for the central phrases (e.g. VP, NP, then PP).
- Assign phrases to appropriate roles around the central concepts (e.g. bind a PP into the VP representation).
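The steps above can be sketched in a few lines of Python. This is a minimal illustration, not an implementation from the slides: the dictionary layout of the parse and the role names `agent`/`theme` are assumptions chosen to match the John-makes-tools example that follows.

```python
# Minimal sketch (hypothetical data layout): map a parser's surface
# structure to a verb-centered semantic frame.
def to_semantic_frame(parse):
    """Build a semantic frame around the central VP concept."""
    frame = {"event": parse["VP"]["verb"]}           # central concept from the VP
    frame["agent"] = parse["NP_subj"]["root"]        # subject NP fills the agent role
    if "NP_obj" in parse["VP"]:                      # bind the object NP, if present
        frame["theme"] = parse["VP"]["NP_obj"]["root"]
    return frame

# "John makes tools" as a toy parser output
parse = {"NP_subj": {"root": "john"},
         "VP": {"verb": "make", "NP_obj": {"root": "tool"}}}
print(to_semantic_frame(parse))  # {'event': 'make', 'agent': 'john', 'theme': 'tool'}
```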

Ontology (Interlingua) approach
- Ontology: a language-independent classification of objects, events, relations.
- A Semantic Lexicon which connects lexical items to nodes (concepts) in the ontology.
- An Analyzer that constructs Interlingua representations and selects the appropriate one.

Semantic Lexicon
- Provides a syntactic context for the appearance of the lexical item.
- Provides a mapping for the lexical item to a node in the ontology (or more complex associations).
- Provides connections from the syntactic context to semantic roles and constraints on these roles.

Deriving Basic Semantic Dependency (a toy example)
Input: John makes tools
Syntactic Analysis:
  cat: verb
  tense: present
  subject:
    root: john
    cat: noun-proper
  object:
    root: tool
    cat: noun
    number: plural

Lexicon Entries for John and tool

John-n1
  syn-struc
    root: john
    cat: noun-proper
  sem-struc
    human
      name: john
      gender: male

tool-n1
  syn-struc
    root: tool
    cat: n
  sem-struc
    tool

Meaning Representation - Example make
Relevant extract from the specification of the ontological concept used to describe the appropriate meaning of make:

manufacturing-activity
  ...
  agent: human
  theme: artifact
  ...

Relevant parts of the (appropriate senses of the) lexicon entries for John and tool:

John-n1
  syn-struc
    root: john
    cat: noun-proper
  sem-struc
    human
      name: john
      gender: male

tool-n1
  syn-struc
    root: tool
    cat: n
  sem-struc
    tool

Semantic Dependency Component
The basic semantic dependency component of the TMR for John makes tools:

manufacturing-activity-7
  agent: human-3
  theme: set-1
    element: tool
    cardinality: > 1
  ...

Lexicon entry for try (sense try-v3):

try-v3
  syn-struc
    root: try
    cat: v
    subj:  root: $var1, cat: n
    xcomp: root: $var2, cat: v, form: OR(infinitive, gerund)
  sem-struc
    set-1
      element-type: refsem-1
      cardinality: >= 1
    refsem-1
      sem: event
        agent: ^$var1
        effect: refsem-2
      modality
        modality-type: epiteuctic
        modality-scope: refsem-2
        modality-value: < 1
    refsem-2
      value: ^$var2
      sem: event

Constructing an Interlingua Representation
For each syntactic analysis:
- Access all semantic mappings and contexts for each lexical item.
- Create all possible semantic representations.
- Test them for coherency of structure and content.

“Why is Iraq developing weapons of mass destruction?”

Word sense disambiguation
- Constraint checking
  - making sure the constraints imposed on the context are met
- Graph traversal
  - is-a links are inexpensive
  - other links are more expensive
  - the "cheapest" structure is the most coherent
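The "cheapest structure wins" idea can be sketched as follows. The link types, costs, and candidate readings here are invented for illustration; the slides only state that is-a links are cheap and other links are expensive.

```python
# Sketch (assumed costs): rank candidate semantic structures by the
# total cost of the ontology links they traverse.
LINK_COST = {"is-a": 1, "agent": 3, "theme": 3, "instrument": 5}

def structure_cost(links):
    """Total traversal cost of one candidate structure; unknown links are penalized."""
    return sum(LINK_COST.get(kind, 10) for kind in links)

# two hypothetical readings of an ambiguous sentence
candidates = {
    "reading-1": ["is-a", "agent", "theme"],        # cost 1 + 3 + 3 = 7
    "reading-2": ["is-a", "instrument", "instrument"],  # cost 1 + 5 + 5 = 11
}
best = min(candidates, key=lambda name: structure_cost(candidates[name]))
print(best)  # reading-1: the cheapest structure is the most coherent
```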

Semantics II

Desiderata for a Semantic Representation
- Verifiability – the semantic representation must be compatible with the knowledge (base) of the system.
- Canonical Form – assign the same representation to different surface expressions that have essentially the same meaning.
- Ambiguity and Vagueness – the representation should be unambiguous and precise (in relation to the knowledge base or information system accessed, etc.).

Semantics - Connecting Words and Worlds
[Diagram: NL input and NL output are mapped to a Semantic Representation, which connects to the Knowledge Representation and the World State (KB: T-Box, A-Box).]

Representation of Meaning
Representation of meaning for natural language sentences:
- Semantic Representation Language = (in most cases) some kind of formal language + semantic primitives
- For example: First-Order Predicate Logic with a specific set of predicates and functions

Semantic Representations
Semantic representation based on some form of (formal) representation language:
- Semantic Networks
- Conceptual Dependency Graphs
- Case Frames
- Ontologies
- DL and similar KR languages
- First-Order Predicate Logic

Example - NL Database Access
Imagine accessing a database using natural language, i.e. posing questions to the DB in natural language.
Example: DB of courses in the CS department. Pose questions like:
- Who is teaching Advanced AI in Fall 2008?
- Is John Anderson teaching this term?
- What is Jacky Baltes teaching this term?
- Who is teaching AI at the University of Winnipeg?
- Who is teaching an AI related course this term?

Example
Story: My car was stolen two weeks ago. They found it last week.
Levels involved: direct representation of meaning – knowledge – inference

Example
Primitives in the logic language FOPL:
- my car as an individual constant: my_car, car_1
- can make a statement about ownership of the car: owns(car_1, Speaker)
- 2-place predicate owns, with one place for the object (the car) and one place for the owner; filled with a variable or a constant
- Someone owns car_1:  ∃x: owns(car_1, x)
- I own all cars:  ∀x: car(x) → owns(x, Speaker)

Example
Primitives in the logic language FOPL:
- stolen as a predicate applied to the car: stolen(car_1)
- as an event, specified with a variable for the event and a constant for the specific event:
  ∃e, x: event(e) ∧ stolen(e, x) ∧ x = car_1
  or ∃e: event(e) ∧ stolen(e, car_1)
- can make additional specifications, e.g. tense, time, location:
  ∃e: event(e) ∧ stolen(e, car_1) ∧ past(e) ∧ time(e) = UT − 2 weeks ∧ loc(e) = street#1
- time(e) = UT − 2 weeks: the event time is two weeks before the utterance time (UT)
- loc(e) = street#1 refers to an identified street

Example
Primitives in the logic language FOPL:
They found it last week.
  ∃t: found(car_1, t) ∧ time(t) ∧ t = UT − 1 week

NL and Logic
Levels of Representation and Transformation
- direct representation of meaning
  - translation into a logic expression
- knowledge
  - stored information about relations etc., e.g. as rules
  - ontology; terminology; proper axioms
- inference
  - gain additional information, draw conclusions
  - combine semantic representation and knowledge

Example
Concrete world description:
  car(my_car)
  stolen(my_car, t1), owns(speaker, my_car)
  found(police, my_car, t2)
  t1 < t2
General world knowledge:
  stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) implies has(y, x, t3)
  for some time points t1, t2, t3 with t1 < t2 < t3
What can you infer if you instantiate x with my_car?

Reichenbach's Approach to English Tenses
[Figure from Jurafsky and Martin, p. 530: timeline relating U = Time of Utterance, R = Reference Time, E = Event Time.]

Example
From the facts
  stolen(my_car, t1) ∧ owns(speaker, my_car) ∧ found(police, my_car, t2)
and the rule
  stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) implies has(y, x, t3), with t1 < t2 < t3,
we can infer:
  has(speaker, my_car, t3)
Pattern matching with variable binding: unification; inference.
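A toy version of this inference step can be written directly over tuples standing for ground atoms. The fact encoding and rule application below are a sketch under assumed names, not a general theorem prover: it binds x and y exactly as in the my_car example and adds the conclusion.

```python
# Toy forward inference (hypothetical encoding) for the rule:
#   stolen(x, t1) & owns(y, x) & found(police, x, t2)  =>  has(y, x, t3)
facts = {("stolen", "my_car", "t1"),
         ("owns", "speaker", "my_car"),
         ("found", "police", "my_car", "t2")}

def apply_rule(facts):
    """Bind x and y against the facts and add the derived conclusion."""
    derived = set(facts)
    for (_, x, _t1) in (f for f in facts if f[0] == "stolen"):
        for (_, y, x2) in (f for f in facts if f[0] == "owns"):
            # unify the x in stolen(...) with the x in owns(...) and found(...)
            if x == x2 and any(f[0] == "found" and f[2] == x for f in facts):
                derived.add(("has", y, x, "t3"))  # t3 later than t2, by the rule
    return derived

print(("has", "speaker", "my_car", "t3") in apply_rule(facts))  # True
```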

Example
stolen(x, t1) ∧ owns(y, x) — what is the implication?
Exercise: express that, if something is stolen, the owner does not have it anymore!

Predicate-Argument Structure
Verb-centered approach: thematic roles, case roles
- Describe the semantic structure based on the verb and associated roles filled by other parts of the sentence (phrases).
Representation using e.g. logic:
- Transform the structured input sentence (syntax!) into an expression in predicate logic.
- Usually based on a central predicate: the verb, or an equivalent like 'be' + adjective etc.
- Other parts of the sentence directly related to the verb go into the central predicate.

Verb Subcategorization
Consider the possible subcat frames of verbs. Example: 3 different kinds of want:
1. NP want NP           I want money.           want1(Speaker, money)
2. NP want Inf-VP       He wants to go home.    want2(he, to_go_home)
3. NP want NP Inf-VP    I want him to go away.  want3(I, him, to_go_away)
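The frame-to-sense mapping above is naturally a lookup table. A minimal sketch, assuming the parser labels complements with the frame categories shown on the slide:

```python
# Sketch (assumed frame labels): select the want-sense by the
# subcategorization frame the parser found for the complements.
SUBCAT = {
    ("NP", "NP"):           "want1",   # I want money.
    ("NP", "Inf-VP"):       "want2",   # He wants to go home.
    ("NP", "NP", "Inf-VP"): "want3",   # I want him to go away.
}

def want_sense(frame):
    """Return the predicate name for a subject+complements frame."""
    return SUBCAT[tuple(frame)]

print(want_sense(["NP", "Inf-VP"]))  # want2
```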

Example - Restaurant 'Maharani'
- Maharani serves vegetarian food.
- Maharani is a vegetarian restaurant.
- Maharani is close to ICSI.
Write down logical formulas representing the three different sentences.

Logic Formalisms Lambda Calculus

Semantics - Lambda Calculus 1
Logic representations often involve the Lambda Calculus:
- represent central phrases (e.g. the verb) as λ-expressions
- a λ-expression is like a function which can be applied to terms
- insert the semantic representation of complement or modifier phrases etc. in place of the variables

FOPL sentence:  ∀x, y: loves(x, y)
λ-expression:   λx λy loves(x, y)
application:    λx λy loves(x, y) (John)  ⇒  λy loves(John, y)

Semantics - Lambda Calculus 2
Transform a sentence into a lambda-expression: "Ay Caramba is close to ICSI."
- specific: close-to(Ay Caramba, ICSI)
- general:  ∃x, y: close-to(x, y) ∧ x = Ay Caramba ∧ y = ICSI
Lambda Conversion – form the λ-expression:
  λx λy: close-to(x, y) (Ay Caramba)
Lambda Reduction – apply the λ-expression:
  λy: close-to(Ay Caramba, y)
  close-to(Ay Caramba, ICSI)
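The conversion/reduction steps map directly onto curried closures; Python's `lambda` makes each application a reduction step. The string encoding of the resulting formula is an assumption for display purposes:

```python
# λx λy close-to(x, y) as a curried Python closure; each application
# performs one lambda reduction.
close_to = lambda x: lambda y: f"close-to({x}, {y})"

step1 = close_to("AyCaramba")   # λy. close-to(AyCaramba, y)
step2 = step1("ICSI")           # close-to(AyCaramba, ICSI)
print(step2)
```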

Semantics - Lambda Calculus 3
- Lambda expressions as the basis for semantic representations
- attached to words and syntactic categories in grammar rules
- passed between nodes during parsing, according to the grammar
Example: semantics of the verb 'serve'
  Verb → serve  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
Reification denotes the use of predicates as constants. It allows the use of "predicates over predicates", e.g. IS-A(serving, event) or IS-A(restaurant, location).
  e: serving – event, action, verb

Semantics - Lambda Calculus 4
Lambda expressions are constructed from the central expression by inserting semantic representations for the subject and complement phrases:
  Verb → serve  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
Fill in appropriate expressions for x and y, e.g. steak for x, derived from the direct object NP of the sentence, and Ay Caramba for y, the subject NP of the verb.
  y: restaurant – NP, subj.
  x: food – NP, dir. obj.
  e: serving – S, event

Semantics - Lambda Calculus 5
The complete semantic representation is produced by combining the semantic feature structures of the phrases in the sentence, according to extended grammar rules:
  Verb → serves  { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
Successively apply the lambda-expression (for serves in the example above), filling the x-position with the semantics of the object NP, and the y-position with the representation of the subject NP.

Semantics - Lambda Calculus 6
Extend the grammar with semantic attachments, e.g.
  NP → ProperNoun  {ProperNoun.sem}
The "base" semantic attachment is determined through access to a lexicon or an ontology. It corresponds to the concept associated with the lexical word, or in the simplest form just the lexical word itself. Example: Ay Caramba as an individual constant, or meat as a (reified) concept.

Semantics - Lambda Calculus 7
Constructive Semantics
During parsing, these semantic attachments are combined, according to the grammar rules, to form more complete representations, finally covering the whole sentence. Combine and pass semantic attachments upwards, e.g.
  S → NP VP    {VP.sem {NP.sem}}
  VP → Verb NP {Verb.sem {NP.sem}}
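The bottom-up combination can be sketched with the same curried-closure encoding: the Verb attachment is a two-argument λ-expression, the VP rule applies it to the object NP's semantics, and the S rule applies the result to the subject NP's semantics. The formula string is an assumed display format:

```python
# Constructive semantics sketch for "AyCaramba serves meat":
#   VP -> Verb NP  {Verb.sem {NP.sem}}
#   S  -> NP VP    {VP.sem {NP.sem}}
serve_sem = lambda x: lambda y: (
    f"exists e. ISA(e,Serving) & Server(e,{y}) & Served(e,{x})"
)

np_obj_sem = "meat"        # NP.sem of the direct object
np_subj_sem = "AyCaramba"  # NP.sem of the subject

vp_sem = serve_sem(np_obj_sem)   # VP rule: fill the x-position (object)
s_sem = vp_sem(np_subj_sem)      # S rule:  fill the y-position (subject)
print(s_sem)
```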

Semantic Representation in BeRP
[Figure from Jurafsky and Martin: parse tree with semantic attachments for the sentence "AyCaramba serves meat."]

Semantics - Lambda Calculus 8
Modifiers can be added into the semantic description as part of the grammar rules, by intersection of concepts:
  Nominal → Adj Nominal  { λx. Nominal.sem(x) ∧ IS-A(x, Adj.sem) }
Example: a "cheap restaurant"
  λx. IS-A(x, restaurant) ∧ IS-A(x, cheap)
There is a problem if the intersection of concepts is misleading, e.g. "former friend". Use a "modification" rule instead:
  Nominal → Adj Nominal  { λx. Nominal.sem(x) ∧ AdjMod(x, Adj.sem) }
The rule then yields for "cheap restaurant":
  λx. IS-A(x, restaurant) ∧ AdjMod(x, cheap)
where "cheap" modifies the "restaurant" in a specific way.
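The two attachment rules differ only in which predicate wraps the adjective. A minimal sketch, again with an assumed string encoding of the formulas:

```python
# Intersective rule vs. AdjMod rule for Nominal -> Adj Nominal.
intersective = lambda adj, noun: lambda x: f"ISA({x},{noun}) & ISA({x},{adj})"
adjmod       = lambda adj, noun: lambda x: f"ISA({x},{noun}) & AdjMod({x},{adj})"

# "cheap restaurant": plain intersection is fine
print(intersective("cheap", "restaurant")("x"))  # ISA(x,restaurant) & ISA(x,cheap)

# "former friend": intersection would wrongly assert ISA(x,former),
# so the modification rule is used instead
print(adjmod("former", "friend")("x"))           # ISA(x,friend) & AdjMod(x,former)
```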

Semantics - Problems
Problems with modal verbs:
- apply to a predicate structure (another verb)
- referential opaqueness
- no standard implications
Example: I think Joe's flight leaves at 7pm.
  (think (Speaker, leaves (Joe's flight, 7pm)))
Add: Joe's flight is BA727. BA727 is delayed.
  What does the Speaker think now?
Add: I think I go home.
  Should I stay or should I go?
Problem: in FOPL, a predicate cannot be applied to a formula.

Parsing with Semantic Features
Modified Earley Algorithm. Figure 15.5, Jurafsky and Martin, p. 570.

References
Jurafsky, D. & J. H. Martin, Speech and Language Processing, Prentice-Hall. (Chapters 9 and 10)
Helmreich, S., From Syntax to Semantics, presentation in the course, November 2003.