1 Grammars and Parsing (Allen's Chapter 3; Jurafsky & Martin's Chapters 8-9)

2 Syntax
Why is the structure of language (syntax) important?
How do we represent syntax?
What does an example grammar for English look like?
What strategies exist to find the structure in natural language?
A Prolog program to recognise English sentences.

3 Syntax shows the role of words in a sentence. "John hit Sue" vs. "Sue hit John": here, knowing which word is the subject tells us what is going on.

4 Syntax shows how words are related in a sentence. "Visiting aunts ARE boring." vs. "Visiting aunts IS boring." Subject-verb agreement allows us to disambiguate here (aunts who visit vs. the activity of visiting aunts).

5 Syntax shows how words are related between sentences. (a) "Italy was beating England. Germany too." (b) "Italy was being beaten by England. Germany too." The elliptical second sentence cannot be understood on its own, but syntax lets us see what is missing (in (a) Germany was also beating England; in (b) Germany was also being beaten).

6 But syntax alone is not enough. "Visiting museums can be boring." This is not ambiguous for us, as we know there is no such thing as a "visiting museum", but syntax alone cannot show this to a computer. Compare with "Visiting aunts can be boring."

7 How do we represent syntax? Parse Tree

8 An example: parsing the sentence "They are cooking apples."

9 Parse 1

10 Parse 2
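(The parse-tree figures for these two slides are not reproduced in the transcript. The ambiguity they illustrate, sketched informally: one reading brackets the sentence as "They [are cooking] apples", i.e. some people are in the process of cooking apples; the other as "They are [cooking apples]", i.e. the things in question are apples of the kind used for cooking.)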

11 How do we represent syntax? As a list. "Sue hit John" becomes:
[s, [np, [proper_noun, Sue]], [vp, [v, hit], [np, [proper_noun, John]]]]
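Not on the slide: a minimal Prolog sketch (assuming a standard Prolog with DCG support, e.g. SWI-Prolog) of how such a list could be built automatically, anticipating the DCGs of slides 23-24; the tiny lexicon (sue, john, hit) is purely illustrative.
% Sketch only: the extra argument of each non-terminal builds the list representation.
s([s, NP, VP]) --> np(NP), vp(VP).
np([np, [proper_noun, PN]]) --> [PN], { member(PN, [sue, john]) }.
vp([vp, [v, V], NP]) --> [V], { member(V, [hit]) }, np(NP).
% ?- phrase(s(T), [sue, hit, john]).
% T = [s, [np, [proper_noun, sue]], [vp, [v, hit], [np, [proper_noun, john]]]].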

12 Chomsky Hierarchy
Type 0, Unrestricted: α -> β (no restriction on the rule form)
Type 1, Context-Sensitive: |LHS| ≤ |RHS|
Type 2, Context-Free: |LHS| = 1 (a single non-terminal on the left)
Type 3, Regular: |RHS| = 1 or 2, of the form A -> a | aB, or A -> a | Ba

13 What Makes a Good Grammar? Generality Selectivity Understandability

14 Generality of Grammars
Regular: {abd, ad, bcd, b, abcd, …}
S -> a S1 | b S2 | c S3 | d
S1 -> b S2 | c S3 | d
S2 -> c S3 | d
S3 -> d
Context-Free: {aⁿbⁿ}
S -> ab | a S b
Context-Sensitive: {aⁿbⁿcⁿ}, or the copy language {ww}, e.g. {abcddabcdd, abab, asease, …}

15 What strategies exist for trying to find the structure in natural language? Top-down vs. bottom-up.
Bottom-up (start from the words and reduce towards s):
John, hit, the, cat
prpn, hit, the, cat
np, hit, the, cat
np, v, the, cat
np, v, det, cat
np, v, det, n
np, v, np
np, vp
s
Top-down (start from s and expand until the words are reached):
s
s -> np, vp
s -> prpn, vp
s -> John, vp
s -> John, v, np
s -> John, hit, np
s -> John, hit, det, n
s -> John, hit, the, n
s -> John, hit, the, cat

16 What strategies exist for trying to find the structure in natural language? Top-down vs. bottom-up (same derivations as slide 15).
Bottom-up: better if there are many alternative rules for a phrase; worse if there are many alternative terminal symbols (lexical categories) for each word.
Top-down: better if there are many alternative terminal symbols for each word; worse if there are many alternative rules for a phrase.

17 What does an example grammar for English look like? Rewrite rules:
1. sentence -> noun phrase, verb phrase
2. noun phrase -> art, noun
3. noun phrase -> art, adj, noun
4. verb phrase -> verb
5. verb phrase -> verb, noun phrase

18 Parsing as a search procedure:
1. Select the first state from the possibilities list (and remove it from the list).
2. Generate the new states by trying every possible option from the selected state (there may be none if we are on a bad path).
3. Add the states generated in step 2 to the possibilities list.
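Not from the slides: a minimal Prolog sketch of this loop, where goal/1 (the state is a completed parse) and successor/2 (one legal parsing move) are hypothetical predicates to be supplied by a concrete parser.
% Sketch only: agenda-based search over parser states.
search([State|_]) :-
    goal(State).                              % step 1: the selected state is already a success
search([State|Rest]) :-
    findall(Next, successor(State, Next), NewStates),   % step 2: expand the selected state
    append(NewStates, Rest, Possibilities),   % step 3: add the new states (at the front: depth-first)
    search(Possibilities).
% Using append(Rest, NewStates, Possibilities) instead gives breadth-first search (cf. slide 20).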

19 Top-down parsing of "1 The 2 dog 3 cried 4" (numbers mark positions between words; each state pairs a symbol list with a position; rule numbers refer to slide 17).
Step 1: current state ((S) 1); no backup states; initial position.
Step 2: ((NP VP) 1); no backups; Rule 1.
Step 3: ((ART N VP) 1); backup ((ART ADJ N VP) 1); Rules 2 & 3.
Step 4: ((N VP) 2); backup ((ART ADJ N VP) 1); match ART with "the".
Step 5: ((VP) 3); backup ((ART ADJ N VP) 1); match N with "dog".
Step 6: ((V) 3); backups ((V NP) 3), ((ART ADJ N VP) 1); Rules 4 & 5.
Step 7: success (V matches "cried", reaching position 4).

20 What strategies exist for trying to find the structure in natural language? Depth-first vs. breadth-first.
Depth-first: try rules one at a time and backtrack if you get stuck; easier to program; less memory required; good if the parse tree is deep.
Breadth-first: try all rules at the same time; can be faster; the order of rules is not important; good if the tree is flat.

21 An Example of Top-Down Parsing 1 The 2 old 3 man 4 cried 5

22 Depth First Search versus Breadth First

23 What does a Prolog program look like that tries to recognise English sentences? As a definite clause grammar (DCG):
s --> np, vp.
np --> det, n.
np --> det, adj, n.
vp --> v, np.
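The slide stops at the phrase-structure rules. A hypothetical lexicon (the word lists are borrowed from slide 24) would make the DCG runnable in a standard Prolog such as SWI-Prolog:
% Assumed lexicon; not part of the original slide.
det --> [D], { member(D, [the, a, an]) }.
n   --> [N], { member(N, [cat, dog, mat, meat, fish]) }.
adj --> [A], { member(A, [big, fat, red]) }.
v   --> [V], { member(V, [ate, saw, killed, pushed]) }.
% ?- phrase(s, [the, fat, cat, ate, the, fish]).
% true.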

24 What does a Prolog program look like that tries to recognise English sentences?
sentence(S) :- noun_phrase(NP), verb_phrase(VP), append(NP, VP, S).
noun_phrase(NP) :- determiner(D), noun(N), append(D, N, NP).
noun_phrase(NP) :- determiner(D), adj(A), noun(N), append(D, A, AP), append(AP, N, NP).
verb_phrase(VP) :- verb(V), noun_phrase(NP), append(V, NP, VP).
determiner([D]) :- member(D, [the, a, an]).
noun([N]) :- member(N, [cat, dog, mat, meat, fish]).
adj([A]) :- member(A, [big, fat, red]).
verb([V]) :- member(V, [ate, saw, killed, pushed]).
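Example queries (an assumption about usage, not from the slide; any standard Prolog providing member/2 and append/3, e.g. SWI-Prolog, will do):
% ?- sentence([the, dog, ate, the, meat]).
% true.
% ?- sentence([the, big, cat, saw, a, fish]).
% true.
% ?- sentence([dog, the, ate]).
% false.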

25 Pattern matching as an alternative (e.g., Eliza). This uses a database of input/output pairs: the input part of each pair is a template matched against the user's input, and the output part is given as the response.
X computers Y => Do computers interest you?
X mother Y => Tell me more about your family.
But:
Nothing is known about structure (syntax): "I X you => Why do you X me?" is fine for X = like, but not for X = do not know.
Nothing is known about meaning (semantics): "I feel X => I'm sorry you feel X." is fine for X = depressed, but not for X = happy.
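Not from the slides: a minimal Prolog sketch of this kind of template matching over word lists, using append/3 to bind the segment variable in "I X you". The predicate name respond/2 and the tiny template set are assumptions for illustration only.
% Sketch only: Eliza-style templates as clauses over lists of atoms.
respond(Input, [do, computers, interest, you]) :-
    member(computers, Input).                  % "X computers Y"
respond([i|Rest], Reply) :-
    append(X, [you], Rest),                    % "I X you": X captures the middle words
    append([why, do, you|X], [me], Reply).     % "Why do you X me?"
respond([i, feel, X], ['I', am, sorry, you, feel, X]).   % "I feel X"
% ?- respond([i, like, you], R).
% R = [why, do, you, like, me].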