101035 中文信息处理 Chinese NLP Lecture 9.

句——语法分析(2) Grammatical Analysis (2)
句法分析 Syntactic Parsing
搜索式句法分析 Parsing as Search
结构歧义 Structural Ambiguities
动态规划句法分析 Dynamic Programming Parsing

句法分析 Syntactic Parsing Basics The goal of syntactic parsing is to construct a parse tree for a given sentence, based on a grammar or rule system. Parsing is essentially a search through the rule space for valid rule combinations. The search succeeds when a combination is found whose rules generate a parse tree representing the structure of the sentence.

Parsing Methods
Basic searching methods: top-down, bottom-up
Dynamic programming methods: CKY, Earley, chart parsing
Statistical methods: probabilistic parsing

搜索式句法分析 Parsing as Search
Top-Down Parsing: a top-down parser builds a tree from the root node S down to the leaves.
Bottom-Up Parsing: a bottom-up parser starts with the words of the input and builds a tree rooted in the start symbol S.
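To make the contrast concrete, here is a minimal top-down (recursive-descent) recognizer. The toy grammar and lexicon below are illustrative assumptions, not the grammar used in the slides.

```python
# Minimal top-down (recursive-descent) recognizer.
# The grammar and lexicon are toy assumptions for illustration.
GRAMMAR = {
    "S":       [["VP"]],
    "VP":      [["Verb", "NP"]],
    "NP":      [["Det", "Nominal"], ["Nominal"]],
    "Nominal": [["Noun"]],
}
LEXICON = {"book": "Verb", "that": "Det", "flight": "Noun"}

def parse(symbol, words, pos):
    """Yield every input position reachable after expanding `symbol`
    at `pos`, trying rules top-down and depth-first."""
    if symbol in LEXICON.values():              # pre-terminal: match one word
        if pos < len(words) and LEXICON[words[pos]] == symbol:
            yield pos + 1
        return
    for rhs in GRAMMAR.get(symbol, []):         # expand each rule for `symbol`
        ends = [pos]
        for child in rhs:                       # thread positions through the RHS
            ends = [e2 for e in ends for e2 in parse(child, words, e)]
        yield from ends

words = "book that flight".split()
print(len(words) in parse("S", words, 0))       # True: the input is an S
```

Note that this naive depth-first expansion would never terminate on a left-recursive rule such as NP → NP PP, one of the weaknesses that the dynamic programming methods later in the lecture address.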

Example Book that flight.

Example Top-down parsing Book that flight. …

Example Bottom-up parsing Book that flight.

In-Class Exercise Supply the omitted steps of the top-down parse so as to derive the final correct parse tree. (To save space, terminal nodes may be expanded in a single step.)

Top-Down vs Bottom-Up Top-down parsing never wastes time exploring trees that cannot result in an S, whereas bottom-up parsing generates many trees that can never lead to an S. Conversely, bottom-up parsing only generates trees consistent with the input words, whereas top-down parsing spends considerable effort on S-rooted trees that are inconsistent with the input.

结构歧义 Structural Ambiguities A Major Challenge One sentence often corresponds to more than one parse tree, each rendering a different meaning. Structural ambiguities are a major challenge for syntactic parsing.

Ambiguities in English
Attachment ambiguity: I shot an elephant in my pajamas.

Ambiguities in English
Attachment ambiguity: I can see old men and women in the park.
Ambiguities in Chinese
“VP+的+是+NP” pattern: 反对的是少数人
“N1+N2+N3” pattern: 北欧语言研究会
“ADJ+N1+N2” pattern: 小学生词典
“VP+N1的+N2” pattern: 咬死了猎人的狗

Ambiguities in Chinese
“N1+的+N2+和+N3” pattern: 衣服的袖子和口袋
“V+N1+N2” pattern: 赠意大利图书
“MQ+NP1+的+NP2” pattern: 三个学校的实验员
“VP+MQ+NP” pattern: 发了三天工资

动态规划句法分析 Dynamic Programming Parsing Features Dynamic programming parsing methods are efficient because each subtree is discovered once, stored, and then reused in every parse that calls for that constituent. By storing all possible parses, they also partially solve the ambiguity problem.

CKY Parsing A dynamic programming bottom-up parsing method. Example: Book the flight through Houston. The grammar must first be converted to Chomsky Normal Form (CNF), in which every rule has the form A → B C or A → w.
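One step of the CNF conversion is binarizing rules whose right-hand side is longer than two symbols. The sketch below shows only that step (unit-production and mixed terminal/non-terminal removal are omitted); the input rule and the fresh symbol names (X1, …) are illustrative assumptions.

```python
# Binarizing long rules, one step of the conversion to CNF.
# The input rule and fresh symbol names (X1, ...) are illustrative.
def binarize(lhs, rhs, start=1):
    """Replace lhs -> rhs (len(rhs) > 2) with an equivalent set of binary rules."""
    out, n = [], start
    while len(rhs) > 2:
        fresh = f"X{n}"                  # invent a new non-terminal
        n += 1
        out.append((fresh, rhs[-2:]))    # e.g. X1 -> NP VP
        rhs = rhs[:-2] + [fresh]         # shorten the original rule
    out.append((lhs, rhs))
    return out

print(binarize("S", ["Aux", "NP", "VP"]))
# [('X1', ['NP', 'VP']), ('S', ['Aux', 'X1'])]
```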

CKY Parsing For a sentence of length n, CKY works with the upper-triangular portion of an (n+1)×(n+1) matrix. Each cell [i, j] of this matrix contains the set of non-terminals representing all the constituents that span positions i through j of the input: 0 Book 1 that 2 flight 3, so the whole sentence is the span [0, 3]. CKY parsing thus amounts to filling in this parse table.

CKY Parsing Algorithm
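The algorithm itself appeared as a figure in the original deck; a recognizer version of the table-filling loop can be sketched as follows, using the [i, j] span indexing described earlier. The CNF grammar and lexicon here are toy assumptions.

```python
from collections import defaultdict

# CKY recognition over the upper-triangular table described above.
# The CNF grammar and lexicon are toy assumptions for illustration.
UNARY = {   # word -> pre-terminals (with unary-closed categories folded in)
    "book":   {"S", "VP", "Verb", "Nominal", "Noun"},
    "the":    {"Det"},
    "flight": {"Nominal", "Noun"},
}
BINARY = {  # (B, C) -> set of A such that A -> B C
    ("Verb", "NP"):     {"S", "VP"},
    ("Det", "Nominal"): {"NP"},
}

def cky(words):
    n = len(words)
    table = defaultdict(set)             # table[(i, j)]: constituents over i..j
    for j, w in enumerate(words, 1):
        table[(j - 1, j)] = set(UNARY.get(w, ()))    # length-1 spans
    for span in range(2, n + 1):                     # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):                # try every split point
                for b in table[(i, k)]:
                    for c in table[(k, j)]:
                        table[(i, j)] |= BINARY.get((b, c), set())
    return table

t = cky("book the flight".split())
print("S" in t[(0, 3)])     # True: an S covers the whole input
```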

CKY Parsing Example Book the flight through Houston.

CKY Parsing Book the flight through Houston.

In-Class Exercise When CKY ends (on the previous slide), it generates three possible parses at once (S1, S2, S3). Draw their corresponding parse trees.

Earley A dynamic programming top-down parsing method. The Earley algorithm makes a single left-to-right pass that fills an array called a chart with N+1 entries. Earley's word indexing method is the same as CKY's.
Dotted rule: the structure of a chart state, a rule with a dot (•) separating what has been parsed (left of the dot) from what is still expected (right of the dot). A state's position with respect to the input is represented by two numbers indicating where the state begins and where its dot lies. For example, in NP → Det • Nominal, [1,2], the NP begins at position 1 and the dot lies at position 2. A state of the form S → α •, [0,N] indicates a successful parse.

Earley: Three Operations
Predictor: creates new states representing top-down expectations generated during the parsing process; it is applied when a non-terminal lies to the right of the dot.
Scanner: when a state has a POS category to the right of the dot, Scanner examines the input and incorporates into the chart a state corresponding to the prediction of a word with that POS.
Completer: applied to a state whose dot has reached the right end of its rule; it finds, and advances, all previously created states that were looking for the grammatical category that has just been completed.
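The three operations can be assembled into a compact recognizer sketch. The grammar, lexicon, and the dummy GAMMA start state below are illustrative assumptions.

```python
# A compact Earley recognizer built from the three operations above.
# The grammar, lexicon, and dummy GAMMA start state are illustrative.
GRAMMAR = {
    "S":       [("VP",)],
    "VP":      [("Verb", "NP")],
    "NP":      [("Det", "Nominal")],
    "Nominal": [("Noun",)],
}
LEXICON = {"book": {"Verb"}, "that": {"Det"}, "flight": {"Noun"}}

def earley(words):
    n = len(words)
    chart = [[] for _ in range(n + 1)]       # chart[k]: states whose dot lies at k

    def add(state, k):
        if state not in chart[k]:
            chart[k].append(state)

    add(("GAMMA", ("S",), 0, 0), 0)          # dummy start state: GAMMA -> . S
    for k in range(n + 1):
        for lhs, rhs, dot, start in chart[k]:   # chart[k] may grow as we iterate
            if dot < len(rhs):
                nxt = rhs[dot]
                if nxt in GRAMMAR:                                # Predictor
                    for prod in GRAMMAR[nxt]:
                        add((nxt, prod, 0, k), k)
                elif k < n and nxt in LEXICON.get(words[k], ()):  # Scanner
                    add((lhs, rhs, dot + 1, start), k + 1)
            else:                                                 # Completer
                for l2, r2, d2, s2 in chart[start]:
                    if d2 < len(r2) and r2[d2] == lhs:
                        add((l2, r2, d2 + 1, s2), k)
    # success iff GAMMA -> S ., [0, N] is in the last chart entry
    return ("GAMMA", ("S",), 1, 0) in chart[n]

print(earley("book that flight".split()))   # True
```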

Earley Algorithm: 3 core operations

Earley Example Book that flight.

Earley Example Book that flight.

Earley States that lead to the correct parse. Book that flight.

Wrap-Up
句法分析 Syntactic Parsing: parsing methods
搜索式句法分析 Parsing as Search: top-down, bottom-up
结构歧义 Structural Ambiguities: ambiguities in English, ambiguities in Chinese
动态规划句法分析 Dynamic Programming Parsing: features, CKY parsing, Earley, examples