74.793 NLP and Speech 2004: Feature Structures and Unification

Feature Structures - General Feature structures describe linguistic attributes (features) such as number and person that are associated with words or with syntactic constituents such as noun phrases. A feature structure is a set of feature-value pairs, e.g. hat [Number sing]; buys [Person 3] [Number sing].
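
As an illustration, such flat feature structures can be written down directly as attribute-value mappings. The following Python sketch assumes a plain dict encoding, which is an illustrative choice rather than the notation used in the slides:

```python
# Minimal sketch: flat feature structures as plain Python dicts
# (an assumed encoding, not the course's attribute-value matrix notation).

hat = {"Number": "sing"}
buys = {"Person": "3", "Number": "sing"}

print(hat["Number"])   # sing
print(buys)            # {'Person': '3', 'Number': 'sing'}
```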

Feature Structures - Agreement Feature-value pairs can be collected under a single feature called agreement, whose value is itself a feature structure: buys [agreement [Person 3] [Number sing]].

Feature Structures, Grammar, Parsing Feature structures describe additional syntactic-semantic information, such as category, person, and number (e.g. for goes). Feature structure constraints (agreement constraints) are specified as part of the grammar rules; during parsing, agreement of feature structures is checked by unification. Example: S → NP VP with the constraint ⟨NP agreement⟩ = ⟨VP agreement⟩.

Feature Structures as Constraints Ungrammatical sentences like “He go” or “We goes” can be excluded using feature constraints. Example: S → NP VP with the constraints ⟨NP agreement Person⟩ = ⟨VP agreement Person⟩ and ⟨NP agreement Number⟩ = ⟨VP agreement Number⟩.
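
A minimal sketch of how such an agreement check could be carried out; the dict encoding and the feature value chosen for go ("non-3") are simplifying assumptions for illustration:

```python
# Hypothetical agreement check for S -> NP VP: the NP and VP must match
# on every agreement feature they share. Encodings below are assumptions.

def agreement_ok(np, vp):
    shared = set(np["agreement"]) & set(vp["agreement"])
    return all(np["agreement"][f] == vp["agreement"][f] for f in shared)

he = {"cat": "noun", "agreement": {"Person": "3", "Number": "sing"}}
goes = {"cat": "verb", "agreement": {"Person": "3", "Number": "sing"}}
go = {"cat": "verb", "agreement": {"Person": "non-3", "Number": "sing"}}  # assumed value

print(agreement_ok(he, goes))  # True:  "He goes" is accepted
print(agreement_ok(he, go))    # False: "He go" is excluded
```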

Feature Structures and Categories Add the syntactic category to the feature structure as a feature cat: buys [cat verb] [agreement [Person 3] [Number sing]].

Feature Structures and Unification 1 Compare and combine the feature structures for “he buys”: buys [cat verb] [agreement [Person 3] [Number sing]]; he [cat noun] [agreement [Person 3] [Number sing]].

Using Feature Structures S → NP VP with the constraints ⟨NP agreement Person⟩ = ⟨VP agreement Person⟩ and ⟨NP agreement Number⟩ = ⟨VP agreement Number⟩: he [cat noun] [agreement [Person 3] [Number sing]] and buys [cat verb] [agreement [Person 3] [Number sing]] satisfy the constraints, since their agreement structures unify.

Unification of Feature Structures Agreement is checked by the unification operation ⊔ according to the following rules:
[feature_i value_i] ⊔ [feature_i value_i] = [feature_i value_i]
[feature_i value_i] ⊔ [feature_i value_j] = fail, if value_i ≠ value_j
[feature_i value_i] ⊔ [feature_i undef.] = [feature_i value_i]
[feature_i value_i] ⊔ [feature_j value_j] = [feature_i value_i, feature_j value_j], if feature_i ≠ feature_j
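
These four rules can be implemented directly for flat (non-nested) feature structures. The following is a minimal sketch assuming the dict encoding from earlier; it is not the course's implementation:

```python
# Unification of flat feature structures following the four rules above.
# Returns the unified structure, or None on a value clash.

def unify(fs1, fs2):
    result = dict(fs1)
    for feature, value in fs2.items():
        if feature not in result:
            result[feature] = value          # feature undefined in fs1: take it over
        elif result[feature] != value:
            return None                      # same feature, different values: fail
    return result

he = {"Person": "3", "Number": "sing"}
goes = {"Person": "3", "Number": "sing"}
go = {"Person": "non-3"}                     # assumed encoding

print(unify(he, goes))   # {'Person': '3', 'Number': 'sing'}
print(unify(he, go))     # None (unification fails, "He go" is excluded)
```

Nested structures such as the agreement example would need a recursive version of the same idea.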

Features and Subcategorization An NP consists of a central (head) noun plus modifiers, with agreement; a VP consists of a central (head) verb plus complements, with agreement: “... the man who chased the cat out of the house...”, “... the man chased the barking dog who bit him...”. Agreement features are passed on (inherited) within phrases; e.g. the agreement of a VP is derived from, i.e. determined by, the head verb of the VP (see the sketch below).
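
A hypothetical sketch of this head-feature inheritance, with illustrative structures and names that are not taken from the course materials:

```python
# Sketch: a VP node inherits its agreement features from its head verb.

def build_vp(head_verb, complements):
    return {
        "cat": "VP",
        "head": head_verb,
        "complements": complements,
        "agreement": head_verb["agreement"],   # percolated up from the head
    }

chased = {"cat": "verb", "agreement": {"Person": "3", "Number": "sing"}}
np_dog = {"cat": "NP", "root": "dog"}

vp = build_vp(chased, [np_dog])
print(vp["agreement"])   # {'Person': '3', 'Number': 'sing'}
```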

Semantics

Distinguish between the surface structure (syntactic structure) and the deep structure (semantic structure) of sentences. Different forms of semantic representation: logic formalisms; ontology / semantic representation languages, e.g. Case Frame Structures (Fillmore), Conceptual Dependency Theory (Schank), Description Logics (DL) and similar KR languages, ontologies.

Semantic Representations Semantic representations are based on some form of (formal) representation language: Semantic Networks, Conceptual Dependency Graphs, Case Frames, Ontologies, Description Logics (DL) and similar KR languages.

Constructing a Semantic Representation General approach: start with the surface structure, as derived from the parser; map the surface structure to a semantic structure, using phrases as sub-structures; find concepts and representations for the central phrases (e.g. VP, NP, then PP); assign phrases to appropriate roles around the central concepts (e.g. bind the PP into the VP representation). A small sketch of these steps follows.
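
The toy sketch below illustrates the last two steps for a simple transitive sentence; the mapping table, role names, and structures are assumptions made for illustration (the ontology-based example later in the slides is more detailed):

```python
# Toy sketch: pick a central concept for the verb, then bind the
# subject and object NPs to roles around it. All names are illustrative.

PHRASE_TO_CONCEPT = {"make": "manufacturing-activity"}

def build_semantic_frame(verb_root, subject_np, object_np):
    return {
        "concept": PHRASE_TO_CONCEPT[verb_root],
        "agent": subject_np,    # subject NP bound to the agent role
        "theme": object_np,     # object NP bound to the theme role
    }

frame = build_semantic_frame("make", {"root": "john"}, {"root": "tool", "number": "plural"})
print(frame)
```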

Ontology (Interlingua) Approach Ontology: a language-independent classification of objects, events, and relations. A semantic lexicon, which connects lexical items to nodes (concepts) in the ontology. An analyzer that constructs Interlingua representations and selects an appropriate one. (Based on Steve Helmreich's 419 class, Nov 2003.)

Semantic Lexicon Provides a syntactic context for the appearance of the lexical item; provides a mapping from the lexical item to a node in the ontology (or more complex associations); provides connections from the syntactic context to semantic roles and constraints on these roles.

Deriving Basic Semantic Dependency (a toy example) Input: “John makes tools”. Syntactic analysis:
cat verb, tense present
subject: root john, cat noun-proper
object: root tool, cat noun, number plural

Lexicon Entries for John and tool
John-n1: syn-struc: root john, cat noun-proper; sem-struc: human, name john, gender male
tool-n1: syn-struc: root tool, cat n; sem-struc: tool

Meaning Representation - Example: make Relevant extract from the specification of the ontological concept used to describe the appropriate meaning of make:
manufacturing-activity: agent human, theme artifact, …

Semantic Dependency Component The basic semantic dependency component of the TMR (text meaning representation) for “John makes tools”:
manufacturing-activity-7: agent human-3, theme set-1 (element tool, cardinality > 1), …
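
A hypothetical rendering of this TMR fragment as a nested structure; the instance names follow the slide, everything else (the dict encoding in particular) is an assumption:

```python
# Sketch: the basic semantic dependency for "John makes tools" as nested dicts.

tmr = {
    "manufacturing-activity-7": {
        "agent": "human-3",                                   # the instance standing for John
        "theme": {"set-1": {"element": "tool", "cardinality": "> 1"}},
    },
}

print(tmr["manufacturing-activity-7"]["agent"])   # human-3
```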

try-v3 (lexicon entry for the verb try):
syn-struc: root try, cat v; subj: root $var1, cat n; xcomp: root $var2, cat v, form OR(infinitive, gerund)
sem-struc: set-1: element-type refsem-1, cardinality >= 1; refsem-1: sem event, agent ^$var1, effect refsem-2, modality: modality-type epiteuctic, modality-scope refsem-2, modality-value < 1; refsem-2: value ^$var2, sem event

Constructing an IL representation For each syntactic analysis: Access all semantic mappings and contexts for each lexical item. Create all possible semantic representations. Test them for coherency of structure and content.
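
As an illustration of this generate-and-test idea, the toy sketch below enumerates every combination of word senses and keeps the highest-scoring one; the lexicon, senses, and scoring function are made-up placeholders, not the actual analyzer:

```python
# Toy generate-and-test sketch: enumerate candidate sense assignments
# for one syntactic analysis and keep the most coherent one.
from itertools import product

def analyze(words, semantic_lexicon, coherence_score):
    sense_sets = [semantic_lexicon[w] for w in words]               # all mappings per item
    candidates = [dict(zip(words, senses)) for senses in product(*sense_sets)]
    return max(candidates, key=coherence_score)                      # most coherent wins

lexicon = {"make": ["manufacturing-activity", "coerce"], "tool": ["tool", "pawn"]}
score = lambda c: (c["make"] == "manufacturing-activity") + (c["tool"] == "tool")
print(analyze(["make", "tool"], lexicon, score))   # {'make': 'manufacturing-activity', 'tool': 'tool'}
```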

“Why is Iraq developing weapons of mass destruction?”

Word Sense Disambiguation Constraint checking: making sure the constraints imposed on the context are met. Graph traversal: is-a links are inexpensive, other links are more expensive; the “cheapest” structure is the most coherent (“Hunter-Gatherer” processing).
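
A minimal sketch of the cost idea, with made-up link costs and candidate paths; it only illustrates why cheap is-a links favour one reading over another:

```python
# Sketch: prefer the reading whose connection to the context uses the
# cheapest ontology links. Costs and paths below are illustrative assumptions.

LINK_COST = {"is-a": 0.1, "other": 1.0}

def path_cost(path):
    return sum(LINK_COST[link] for link in path)

reading_a = ["is-a", "is-a"]             # connected through is-a links only
reading_b = ["other", "is-a", "other"]   # needs more expensive links

best = min([reading_a, reading_b], key=path_cost)
print(best, path_cost(best))             # ['is-a', 'is-a'] 0.2 -> most coherent
```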

Logic Formalisms Lambda Calculus

Semantics - Lambda Calculus 1 Logic representations often involve the lambda calculus: represent central phrases (e.g. the verb) as λ-expressions; a λ-expression is like a function which can be applied to terms; insert the semantic representations of complement or modifier phrases in place of the variables.
∀x, y: loves(x, y) (a FOPL sentence)
λx λy loves(x, y) (a λ-expression, i.e. a function)
λx λy loves(x, y) (John) → λy loves(John, y)

Semantics - Lambda Calculus 2 Transform a sentence into a lambda expression: “AI Caramba is close to ICSI.”
specific: close-to(AI Caramba, ICSI)
general: ∃x, y: close-to(x, y) ∧ x = AI Caramba ∧ y = ICSI
Lambda conversion: λ-expr: λx λy close-to(x, y) (AI Caramba)
Lambda reduction: λy close-to(AI Caramba, y), then close-to(AI Caramba, ICSI)
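
The same conversion and reduction can be mimicked with curried Python lambdas; the string-building predicate below is only an illustration of the mechanics:

```python
# Sketch: lambda reduction with curried Python lambdas.

close_to = lambda x: lambda y: f"close-to({x}, {y})"

step1 = close_to("AI Caramba")   # reduces lambda x ... to lambda y close-to(AI Caramba, y)
step2 = step1("ICSI")            # second reduction gives the full proposition

print(step2)                     # close-to(AI Caramba, ICSI)
```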

Semantics - Lambda Calculus 3 Lambda expressions can be constructed from a central expression by inserting the semantic representations of the complement phrases. Verb → serves { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) } represents the general semantics of the verb ‘serve’. Fill in appropriate expressions for x and y, for example ‘meat’, derived from the noun in the NP complement to the verb, as the thing served.
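
A hypothetical sketch of the same event-style representation as a curried function; the argument order (complement first, then subject) and the string output are illustrative choices:

```python
# Sketch: event-style semantics for "serves" as a curried builder.

def serves(x):
    # lambda x: lambda y: exists e. IS-A(e, Serving) & Server(e, y) & Served(e, x)
    def with_server(y):
        return f"exists e: IS-A(e, Serving) & Server(e, {y}) & Served(e, {x})"
    return with_server

vp = serves("meat")          # the NP complement fills the Served argument
print(vp("restaurant"))      # the subject fills the Server argument
```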

References
Jurafsky, D. & J. H. Martin, Speech and Language Processing, Prentice Hall (Chapters 9 and 10).
Helmreich, S., From Syntax to Semantics, presentation in the course, November 2003.