Computational Grammars Azadeh Maghsoodi

History: before 1800, 1800–1900, the early 20th century, the 1920s, World War II, the late 1950s, nowadays

Before 1800: traditional grammar
– Prescribed the "correct" speech of a specific language
– Not scientific; later rejected
– One lasting useful contribution: parts of speech (POS)

1800–1900: philology
– Indo-European languages
– Comparing a language with other languages
– Comparing a language with its own history

Early 20th century
– "Enough philology!"
– Studying a language as it exists at a specific point in time

The 1920s: America and Western Europe
– A new intellectual pattern: understanding mental processes in human beings

World War II
– Mathematical logic as a study tool
– The invention of the computer opened up new applications
– Abstract models of the mind brought the end of behaviorism

Late 1950s: Chomsky is coming!
– Formal language theory; "Syntactic Structures"
– Language categories (the Chomsky hierarchy):
– Type 0: unrestricted
– Type 1: context-sensitive
– Type 2: context-free
– Type 3: regular
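The hierarchy's levels can be separated by small example languages. As an illustration not taken from the slides, the language aⁿbⁿ is generated by the context-free rule S → a S b | a b but by no regular (Type 3) grammar; a recursive recognizer can follow the rule directly:

```python
# A minimal sketch (not from the slides): {a^n b^n | n >= 1} is the
# classic language separating Type 2 (context-free) from Type 3
# (regular). The recognizer mirrors the CFG  S -> 'a' S 'b' | 'a' 'b'.

def is_anbn(s: str) -> bool:
    """Recognize a^n b^n (n >= 1) by peeling one 'a' and one 'b'."""
    if len(s) >= 2 and s[0] == "a" and s[-1] == "b":
        middle = s[1:-1]
        return middle == "" or is_anbn(middle)
    return False

print(is_anbn("aaabbb"))  # True
print(is_anbn("aab"))     # False
```

A finite-state machine cannot count the unbounded nesting, which is exactly why natural-language phenomena beyond regular power motivated richer grammar classes.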

Late 1950s (continued): Chomsky's followers professed:
– Generative grammar: accurate and definite enough to be tested
– Goal: modeling the unconscious linguistic knowledge of language users
– A biological, inborn basis for linguistic abilities: a Universal Grammar of shared structures

Nowadays
Motives:
– Discovering the structure of the human mind
– Language-processing technology
Applications:
– Word processors
– Machine translation (MT)
– Word predictors
– Text predictors
– UFIs / DB queries
– Information retrieval

Syntactic Model
– Grammars
– Parse algorithms

Computational Grammars: generative grammars
– Grew out of natural language theory
– Introduced by Chomsky
– Accurate and definite structures
– Two families: Transformational grammar (TG) and Constraint-Based Lexicalist grammar (CBLG)

TG
– Lower computational efficiency
– Strong theoretical basis
– Complex rules, simple lexicon

TG (continued)
– The Chomsky hierarchy and the first TG
– Standard Theory (1965)
– Extended Standard Theory
– Government and Binding Theory

Standard Theory
– A sentence has a deep structure and a surface structure
– A generative TG has two parts:
– Basic part: a CFG that produces the deep structure
– Transformational part: transformational rules

Transformational rules convert deep structures into surface structures (one rule ~ one transformation). Example sentences sharing the same deep structure:
– (i) The boy places the book on the table.
– (ii) The boy has placed the book on the table.
– (iii) Did the boy place the book on the table?

Transformational Rules (example)
A deep structure:
[S [NP [Det the] [N boy]] [VP [Aux will] [V place] [NP the book]]]

Transformational Rules (example)
To produce a yes/no question, a Move transformation fronts the auxiliary:
S[NP VP[Aux V NP]] → S[Aux NP VP[V NP]]
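The aux-fronting transformation above can be sketched with trees as nested tuples; the tree shapes and the function name are illustrative assumptions, not part of any real TG implementation:

```python
# A hedged sketch of the Move transformation, with trees encoded as
# ('Label', child, child, ...) tuples. The rule applied is exactly:
#   S[NP VP[Aux V NP]]  ->  S[Aux NP VP[V NP]]
# All names and structures here are illustrative.

def yes_no_question(tree):
    """Apply the aux-fronting Move transformation to a declarative S."""
    label, np, vp = tree
    assert label == "S" and vp[0] == "VP"
    aux, verb, obj = vp[1], vp[2], vp[3]
    # Move the auxiliary to the front of the sentence.
    return ("S", aux, np, ("VP", verb, obj))

deep = ("S",
        ("NP", "the boy"),
        ("VP", ("Aux", "will"), ("V", "place"), ("NP", "the book")))
surface = yes_no_question(deep)
print(surface)
# ('S', ('Aux', 'will'), ('NP', 'the boy'), ('VP', ('V', 'place'), ('NP', 'the book')))
```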

Government and Binding Theory (GB)
– A theory of universal grammar
– Learning a language = setting a small set of parameters + learning the lexicon
– Move α maps deep structures to surface structures
– "Move α" may move anything anywhere; a set of constraints filters out the illegitimate movements

GB (continued) – the model's architecture:
Lexicon → deep structure → (Move-α) → surface structure
Surface structure → (LF Move-α) → logical form
Surface structure → (stylistic and phonological rules) → phonological form

GB (continued)
– The Minimalist Program (MP): choose the best candidate derivation instead of producing one directly; still under study

CBLG
– Grew out of TGs, aiming to increase the computational efficiency of grammars
– Simple rules, complex lexicon
– Motivated both psychologically and computationally

CBLG (continued)
– Constraint-based architecture: constraint satisfaction matters more than transformational derivation
– Strict lexicalism:
– Words are the syntactic atoms of a language
– Their internal structure is independent of syntactic constraints

CBLG (continued)
– Surface structures are produced directly
– Most computational grammars are CBLGs

Computational Grammars
– Unification grammar (UG)
– Categorial grammar (CG)
– Dependency grammar (DG)
– Link grammar
– Lexical-Functional grammar (LFG)
– Tree-Adjoining grammar (TAG)
– Generalized Phrase Structure grammar (GPSG)
– Head-driven Phrase Structure grammar (HPSG)

Unification Grammar (UG)
– Many CBLGs are unification grammars
– Augmented CFGs:
– A plain CFG cannot handle long-distance dependencies
– A generalized CFG plus a set of features
– Examples: Augmented Transition Network (ATN), Definite Clause Grammar (DCG)
– Unification grammars proper

UG (continued): unification grammars proper
– Feature structures are extended
– No CFG backbone is needed
– A grammar is a set of constraints between feature structures
– Key concept: the subsumption relation

UG (continue) CAT verb ROOT cry CAT verb ROOT cry CAT verb VFORM present VFORM present (Unificator)

UG (example)
The CFG rule S → NP VP expressed as a unification-grammar rule X0 → X1 X2 with constraints:
– CAT 0 = S, CAT 1 = NP, CAT 2 = VP
– AGR 0 = AGR 1 = AGR 2
– VFORM 0 = VFORM 2
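Feature-structure unification, as in the examples above, can be sketched over flat Python dicts; real unification grammars use DAGs with shared (reentrant) values, which this toy version omits:

```python
# A minimal sketch of feature-structure unification over flat dicts,
# matching the [CAT verb, ROOT cry] + [CAT verb, VFORM present] example:
# shared features must agree; the result carries the union of features.

def unify(fs1, fs2):
    """Return the unification of two feature structures, or None on clash."""
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result and result[feat] != val:
            return None  # conflicting values: unification fails
        result[feat] = val
    return result

a = {"CAT": "verb", "ROOT": "cry"}
b = {"CAT": "verb", "VFORM": "present"}
print(unify(a, b))  # {'CAT': 'verb', 'ROOT': 'cry', 'VFORM': 'present'}
print(unify(a, {"CAT": "noun"}))  # None
```

Failure on a clash (verb vs. noun) is what lets a unification rule such as AGR 0 = AGR 1 = AGR 2 enforce agreement without extra CFG categories.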

UG (continued)
– More grammatical information is stored in the lexicon
– Fewer grammar rules
– Implemented using DAGs

ATN Grammar
– A transition network ~ an extended finite-state machine
– An ATN grammar ~ a set of transition networks plus features and constraints

Categorial Grammar (CG)
– Most of the rule base is eliminated
– No distinction between lexical and non-lexical categories
– Parts of speech are replaced by complex categories:
– A\B: the argument A is on the left
– B/A: the argument A is on the right

CG (example)
– Peter: NP
– likes: (NP\S)/NP
– peanuts: NP
– passionately: (NP\S)\(NP\S)
– Sentence: "Peter likes peanuts passionately."

CG (example) – derivation of "Peter likes peanuts passionately":
– likes (NP\S)/NP + peanuts NP → likes peanuts NP\S
– likes peanuts NP\S + passionately (NP\S)\(NP\S) → NP\S
– Peter NP + likes peanuts passionately NP\S → S
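The derivation above uses only two rules, forward and backward application, which can be sketched directly; the tuple encoding of categories is an illustrative assumption:

```python
# A hedged sketch of categorial-grammar function application for the
# "Peter likes peanuts passionately" example. Categories are strings
# (atomic) or tuples mirroring the written notation:
#   (A, "\\", B)  encodes  A\B  (argument A on the left, result B)
#   (B, "/", A)   encodes  B/A  (argument A on the right, result B)
NP, S = "NP", "S"
VP = (NP, "\\", S)  # NP\S

lexicon = {
    "Peter": NP,
    "likes": (VP, "/", NP),          # (NP\S)/NP
    "peanuts": NP,
    "passionately": (VP, "\\", VP),  # (NP\S)\(NP\S)
}

def apply_right(functor, arg):
    """Forward application: B/A + A -> B."""
    if isinstance(functor, tuple) and functor[1] == "/" and functor[2] == arg:
        return functor[0]
    return None

def apply_left(arg, functor):
    """Backward application: A + A\\B -> B."""
    if isinstance(functor, tuple) and functor[1] == "\\" and functor[0] == arg:
        return functor[2]
    return None

cat = apply_right(lexicon["likes"], lexicon["peanuts"])  # likes peanuts: NP\S
cat = apply_left(cat, lexicon["passionately"])           # still NP\S
cat = apply_left(lexicon["Peter"], cat)                  # the full sentence
print(cat)  # S
```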

Dependency Grammar (DG)
– American linguists; based on TGs
– Dependencies between words, represented as a dependency tree
– Example, "boys play well": the verb play is the root, with dependents boys (N) and well (Adv)

Link Grammar
– Built on the planarity phenomenon; a legal sequence of words must:
– satisfy each word's local linking requirements (satisfaction)
– contain no crossing links (planarity)
– form one connected graph (connectivity)
– Context-free power; a lexical grammar: the grammar is distributed among the words
– Probabilistic models; applications in speech recognition and handwriting recognition
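The planarity condition above is easy to state computationally: two links drawn above the sentence cross exactly when their endpoints interleave. A minimal sketch, with links as word-position pairs (an illustrative encoding, not the link-grammar parser's own):

```python
# A hedged sketch of the link-grammar planarity check: links are pairs
# (i, j) of word positions with i < j, and two links (i, j), (k, l)
# cross exactly when i < k < j < l.

def planar(links):
    """Return True if no two links cross."""
    for (i, j) in links:
        for (k, l) in links:
            if i < k < j < l:
                return False
    return True

print(planar([(0, 2), (2, 3)]))  # True: links share an endpoint, no crossing
print(planar([(0, 2), (1, 3)]))  # False: the links interleave
```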

Link Grammar (examples)
– [figure: each word's linking requirements]
– [figure: a sentence in which all linking requirements are satisfied]
– [figure: a word sequence that is not part of the language]

Lexical-Functional Grammar (LFG)
– A unification grammar, not a TG
– ATN research and its deficiencies led to LFG
– Groups structures into four structure types

Tree-Adjoining Grammar (TAG)
– Expressive power between CFG and CSG
– The grammar is a set of elementary trees; initial trees are anchored by lexical items
– Two main operations: substitution and adjunction
– High accuracy

TAG (example)
Adjoining the auxiliary tree VP(VP* ADV) into the initial tree S(NP VP(V NP)) at its VP node yields S(NP VP(VP(V NP) ADV))
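Adjunction splices an auxiliary tree into a matching interior node, with the foot node taking over the original subtree. A hedged sketch, with trees as nested tuples and a `"FOOT"` marker as an illustrative convention:

```python
# A hedged sketch of TAG adjunction: an auxiliary tree rooted in VP,
# with a VP foot node (marked "FOOT"), is adjoined at the VP node of an
# initial tree, wrapping the original VP and adding an ADV.

def adjoin(tree, aux, label):
    """Adjoin auxiliary tree `aux` at the first node labelled `label`."""
    if not isinstance(tree, tuple):
        return tree
    if tree[0] == label:
        # The adjoined-at subtree replaces the auxiliary tree's foot node.
        return substitute_foot(aux, tree)
    return (tree[0],) + tuple(adjoin(c, aux, label) for c in tree[1:])

def substitute_foot(aux, subtree):
    if aux == "FOOT":
        return subtree
    if not isinstance(aux, tuple):
        return aux
    return (aux[0],) + tuple(substitute_foot(c, subtree) for c in aux[1:])

initial = ("S", ("NP", "the boy"), ("VP", ("V", "runs")))
auxiliary = ("VP", "FOOT", ("ADV", "fast"))  # VP -> VP* ADV

derived = adjoin(initial, auxiliary, "VP")
print(derived)
# ('S', ('NP', 'the boy'), ('VP', ('VP', ('V', 'runs')), ('ADV', 'fast')))
```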

TAG (continued)
– High accuracy
– NLP applications: MT, information retrieval, …

Generalized Phrase Structure Grammar (GPSG)
– Generates only context-free languages
– CFG rules are split into:
– Immediate Dominance (ID) rules
– Linear Precedence (LP) rules

Head-driven Phrase Structure Grammar (HPSG)
– A lexical grammar based on unification
– Increases the computational power of GPSG
– Simple CFG backbone, complex lexicon

Applications

Parse Algorithms Top-Down parsing Bottom-Up parsing (*)

Parse Algorithms
– Top-down parsing
– Chart parser: dynamic programming
– Recursive Transition Network (RTN) parser: for ATN grammars
– LR parser: shift-reduce algorithms
– Cocke–Younger–Kasami (CYK) parser: dynamic programming over a CNF grammar
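The CYK algorithm named above fills a triangular table of spans bottom-up. A minimal recognizer for a grammar in Chomsky Normal Form, using a toy grammar that is illustrative rather than from the slides:

```python
# A minimal CYK recognizer for a CNF grammar: table[i][j] holds the
# nonterminals that derive words[i..j], built up from shorter spans.
from itertools import product

grammar = {            # binary CNF rules: (B, C) -> set of A with A -> B C
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
lexicon = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "saw": {"V"}}

def cyk(words):
    n = len(words)
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):                 # length-1 spans
        table[i][i] = set(lexicon.get(w, set()))
    for span in range(2, n + 1):                  # longer spans, bottom-up
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):                 # every split point
                for b, c in product(table[i][k], table[k + 1][j]):
                    table[i][j] |= grammar.get((b, c), set())
    return "S" in table[0][n - 1]

print(cyk("the dog saw the cat".split()))  # True
print(cyk("saw the dog the".split()))      # False
```

The triple loop over spans, start positions, and split points gives the O(n³) dynamic-programming behavior that makes CYK (like chart parsing) efficient.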

Efficient Algorithms Chart parser CYK parser

Questions???