Ling 566, Oct 17, 2011: How the Grammar Works (© 2003 CSLI Publications)
Slide 1: Ling 566, Oct 17, 2011: How the Grammar Works (© 2003 CSLI Publications)
Slide 2: Overview
- What we're trying to do
- The pieces of our grammar
- Two extended examples
- Reflection on what we've done, what we still have to do
- Reading questions
Slide 3: What We're Trying To Do
Objectives:
- Develop a theory of knowledge of language
- Represent linguistic information explicitly enough to distinguish well-formed from ill-formed expressions
- Be parsimonious, capturing linguistically significant generalizations
Why formalize?
- To formulate testable predictions
- To check for consistency
- To make it possible to get a computer to do it for us
Slide 4: How We Construct Sentences
The components of our grammar:
- Grammar rules
- Lexical entries
- Principles
- Type hierarchy (very preliminary, so far)
- Initial symbol (S, for now)
We combine constraints from these components. Q: What says we have to combine them?
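Combining constraints from grammar rules, lexical entries, and principles can be modeled as unification of feature structures. Below is a minimal sketch, not the book's formalism: feature structures are nested Python dicts, the feature names are hypothetical, and tags (reentrancy) are left out.

```python
def unify(fs1, fs2):
    """Unify two feature structures; return None if their constraints clash."""
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        return fs1 if fs1 == fs2 else None  # atomic values must be identical
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result:
            merged = unify(result[feat], val)
            if merged is None:
                return None  # feature clash: the constraints are incompatible
            result[feat] = merged
        else:
            result[feat] = val  # constraint contributed by only one source
    return result

# Constraints from a lexical entry and a grammar rule combine by unification:
lex = {"HEAD": {"POS": "noun", "AGR": "3sing"}}
rule = {"HEAD": {"AGR": "3sing"}, "VAL": {"SPR": []}}
print(unify(lex, rule))

# Incompatible constraints license no structure at all:
print(unify({"HEAD": {"AGR": "3sing"}}, {"HEAD": {"AGR": "non-3sing"}}))
# -> None
```

The answer to "what says we have to combine them" is, on this view, that a word or phrase structure is licensed only if the unification of all applicable constraints succeeds.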
Slide 5: An Example
A cat slept. Can we build this with our tools? Given the constraints our grammar puts on well-formed sentences, is this one?

Slide 6: Lexical Entry for a
- Is this a fully specified description?
- What features are unspecified?
- How many word structures can this entry license?

Slide 7: Lexical Entry for cat
- Which feature paths are abbreviated?
- Is this a fully specified description?
- What features are unspecified?
- How many word structures can this entry license?
Slide 8: Effect of Principles: the SHAC

Slide 9: Description of Word Structures for cat

Slide 10: Description of Word Structures for a

Slide 11: Building a Phrase

Slide 12: Constraints Contributed by Daughter Subtrees

Slide 13: Constraints Contributed by the Grammar Rule

Slide 14: A Constraint Involving the SHAC

Slide 15: Effects of the Valence Principle

Slide 16: Effects of the Head Feature Principle

Slide 17: Effects of the Semantic Inheritance Principle

Slide 18: Effects of the Semantic Compositionality Principle

Slide 19: Is the Mother Node Now Completely Specified?

Slide 20: Lexical Entry for slept
Slide 21: Another Head-Specifier Phrase (key: HSR, SHAC, Val. Prin., HFP, SIP, SCP)
Slide 22: Is this description fully specified?

Slide 23: Does the top node satisfy the initial symbol?

Slide 24: RESTR of the S node

Slide 25: Another Example

Slide 26: Head Features from Lexical Entries

Slide 27: Head Features from Lexical Entries, plus HFP
Slide 28: Valence Features: Lexicon, Rules, and the Valence Principle (key: Lexicon, Val. Prin., Rules)

Slide 29: Required Identities: Grammar Rules

Slide 30: Two Semantic Features: the Lexicon & SIP

Slide 31: RESTR Values and the SCP
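The Semantic Compositionality Principle makes the mother's RESTR value the sum of the daughters' RESTR lists, appended in order. A small illustrative sketch (the predication names and feature geometry here are simplified placeholders, not the book's exact notation):

```python
def scp_restr(daughter_restrs):
    """Mother's RESTR: the daughters' RESTR lists appended in order."""
    mother = []
    for restr in daughter_restrs:
        mother.extend(restr)
    return mother

# "A cat slept": each word contributes its predications to the sentence's
# RESTR; the shared index "i" links the determiner, noun, and verb.
a_restr = [{"RELN": "exist", "BV": "i"}]
cat_restr = [{"RELN": "cat", "INST": "i"}]
slept_restr = [{"RELN": "sleep", "SLEEPER": "i"}]
print(scp_restr([a_restr, cat_restr, slept_restr]))
```

Note that appending never discards a predication, which is why the S node's RESTR ends up listing every word's contribution.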
Slide 32: An Ungrammatical Example. What's wrong with this sentence?

Slide 33: An Ungrammatical Example. What's wrong with this sentence? So what?

Slide 34: An Ungrammatical Example. The Valence Principle

Slide 35: An Ungrammatical Example. Head Specifier Rule ← contradiction →
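The contradiction on these slides can be viewed as a unification clash: the Head-Specifier Rule identifies the specifier daughter with the element on the head daughter's SPR list, so their feature values must unify. A toy illustration, using hypothetical AGR values (the particular lexical entries are assumptions, not taken from the slides):

```python
def compatible(a, b):
    """Atomic feature values unify only if they are identical."""
    return a == b

# The Head-Specifier Rule identifies the specifier daughter with the item
# on the head daughter's SPR list, so their AGR values must unify.
head_spr_demand = {"AGR": "non-3sing"}  # e.g. what plural "sleep" requires
specifier = {"AGR": "3sing"}            # e.g. "the cat"

if compatible(head_spr_demand["AGR"], specifier["AGR"]):
    print("licensed: the structure satisfies both constraints")
else:
    print("ruled out: contradictory constraints, so no structure is licensed")
```

The grammar never marks a sentence as "bad"; it simply fails to license any structure for it, which is the formal reflex of ungrammaticality.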
Slide 36: Exercise in Critical Thinking
Our grammar has come a long way since Ch. 2, as we've added ways of representing different kinds of information:
- generalizations across categories
- semantics
- particular linguistic phenomena: valence, agreement, modification
What else might we add? What facts about language are as yet unrepresented in our model?
Slide 37: Overview
- What we're trying to do
- The pieces of our grammar
- Two extended examples
- Reflection on what we've done, what we still have to do
- Reading questions
Next time: Catch up & review
Slide 38: Reading Questions
- Do we have to understand the squiggly bits?
- Does RESTR mean we can take any structure and make an infinite number of variations on it?
- Does dative shift mean that we need two lexical entries for every dative-shift verb?
- Is [AGR non-3sing] the same as not [AGR 3sing]?
Slide 39: Reading Questions
- Are the grammar rules applied in some defined order of priority (e.g., HCR before HSR), or can they be applied in any order?
- Re (17) on p. 179: the text says that the predications are listed "in the indicated order". To what extent is that order meaningful?

Slide 40: Reading Questions
The lexical entry for letter has an ADDRESSEE role in its semantics, but in They sent us a letter the text claims that nothing is coindexed with that role. Why isn't us coindexed with it?
Slide 41: (figure only; no transcribed text)
Slide 42: Reading Questions
Do the two structures for We send two letters to Lee have the same meaning?
Slides 43-46: (figures only; no transcribed text)
Slide 47: Reading Questions
- How much of the lexical entries will we be expected to include in our assignments? Will this be specified in the problems? To what extent can we use abbreviations?
- Should this be called "lexically driven phrase structure grammar" instead?
- Is it feasible to show complete trees of complex sentences in any reasonable amount of space?
Slide 48: Reading Questions
Page 173 says that (10) obeys the Valence Principle. Doesn't that apply to all of VAL? Shouldn't the phrase have [2] in its SPR list? It says here that, because of the Head Specifier Rule, the mother's SPR list is empty. Do rules override principles? Furthermore, this appears to be contradicted on page 177, where (14) obeys the Head Complement Rule, so the specifier bubbles up but the mother's COMPS list is empty. How can both happen at the same time, and what's the point of the Valence Principle if we're ignoring it because of these rules?
Slide 49: Reading Questions
Valence Principle: Unless the rule says otherwise, the mother's values for the VAL features (SPR, COMPS, and MOD) are identical to those of the head daughter.
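The principle's default character ("unless the rule says otherwise") also answers the slide-48 question about rules and principles: the rule's own constraints take precedence, and the principle fills in everything the rule leaves open. A minimal sketch of that behavior, assuming (as stated above) that the VAL features are SPR, COMPS, and MOD:

```python
def valence_principle(rule_says, head_daughter_val):
    """Mother's VAL: rule-specified values where given, else the head daughter's."""
    mother = {}
    for feat in ("SPR", "COMPS", "MOD"):
        if feat in rule_says:            # "unless the rule says otherwise"
            mother[feat] = rule_says[feat]
        else:                            # default: identical to head daughter
            mother[feat] = head_daughter_val[feat]
    return mother

# The Head-Specifier Rule empties the mother's SPR but says nothing about
# COMPS or MOD, so those values come up unchanged from the head daughter:
head_val = {"SPR": ["NP"], "COMPS": [], "MOD": []}
print(valence_principle({"SPR": []}, head_val))
# -> {'SPR': [], 'COMPS': [], 'MOD': []}
```

So the rule does not "ignore" the Valence Principle; the principle is exactly the default that applies wherever the rule is silent.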