NLP

Introduction to NLP

Shallow parsing
–Useful for information extraction: noun phrases, verb phrases, locations, etc.
–Example: FASTUS (Appelt and Israel, 1997)
Sample rules for noun groups:
–NG → Pronoun | Time-NP | Date-NP
–NG → (DETP) (Adjs) HdNns | DETP Ving HdNns
–DETP → DETP-CP | DETP-CP
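
A minimal sketch of the idea (not the actual FASTUS cascade): a greedy chunker that groups POS-tagged tokens matching a simplified (DT)? (JJ)* (NN)+ noun-group pattern. The tag names, the function name, and the example sentence are illustrative assumptions.

    def noun_groups(tagged):
        """Greedy left-to-right grouping of (DT)? (JJ)* (NN)+ sequences.
        tagged: list of (word, POS) pairs; returns a list of noun-group word lists."""
        groups, i, n = [], 0, len(tagged)
        while i < n:
            j = i
            if j < n and tagged[j][1] == "DT":       # optional determiner
                j += 1
            while j < n and tagged[j][1] == "JJ":    # any number of adjectives
                j += 1
            k = j
            while k < n and tagged[k][1] == "NN":    # one or more head nouns
                k += 1
            if k > j:                                # at least one noun: emit a chunk
                groups.append([w for w, _ in tagged[i:k]])
                i = k
            else:                                    # no noun group starts here
                i += 1
        return groups

    print(noun_groups([("the", "DT"), ("urban", "JJ"), ("guerrillas", "NN"),
                       ("attacked", "VBD"), ("a", "DT"), ("vehicle", "NN")]))
    # [['the', 'urban', 'guerrillas'], ['a', 'vehicle']]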

Incident: Date                        19 Apr 89
Incident: Location                    El Salvador: San Salvador (CITY)
Incident: Type                        Bombing
Perpetrator: Individual ID            "urban guerrillas"
Perpetrator: Organization ID          "FMLN"
Perpetrator: Organization Confidence  Suspected or Accused by Authorities: "FMLN"
Physical Target: Description          "vehicle"
Physical Target: Effect               Some Damage: "vehicle"
Human Target: Name                    "Roberto Garcia Alvarado"
Human Target: Description             "attorney general": "Roberto Garcia Alvarado"; "driver"; "bodyguards"
Human Target: Effect                  Death: "Roberto Garcia Alvarado"; No Injury: "driver"; Injury: "bodyguards"

Example
–*The dogs bites (agreement)
Example
–*many water (count/mass nouns)
Idea
–S → NP VP (only if the agreement features of the NP, such as person and number, match those of the VP)
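
A minimal sketch of this idea with a toy lexicon (the feature values and phrase entries are assumptions): the S → NP VP rule is licensed only when the NP and the VP carry matching agreement features.

    # Toy agreement features for a few NPs and VPs (illustrative values only).
    NP_FEATURES = {"the dog":  {"PERSON": 3, "NUMBER": "singular"},
                   "the dogs": {"PERSON": 3, "NUMBER": "plural"}}
    VP_FEATURES = {"bites": {"PERSON": 3, "NUMBER": "singular"},
                   "bite":  {"PERSON": 3, "NUMBER": "plural"}}

    def licenses_s(np, vp):
        """S -> NP VP only if NP and VP agree on person and number."""
        np_f, vp_f = NP_FEATURES[np], VP_FEATURES[vp]
        return all(np_f[f] == vp_f[f] for f in ("PERSON", "NUMBER"))

    print(licenses_s("the dog", "bites"))    # True
    print(licenses_s("the dogs", "bites"))   # False  (*The dogs bites)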

Types of unification grammars
–LFG, HPSG, FUG
Handle agreement
–e.g., number, gender, person
Unification
–Two constituents can be combined only if their features can unify
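
A minimal sketch of unification over flat feature structures (plain dicts of atomic values); real unification grammars use recursive feature structures with reentrancy, which this toy version leaves out. The names FAIL and unify are assumptions.

    FAIL = None   # sentinel for unification failure

    def unify(fs1, fs2):
        """Unify two flat feature structures (dicts); return FAIL on conflicting values."""
        result = dict(fs1)
        for feature, value in fs2.items():
            if feature in result and result[feature] != value:
                return FAIL              # same feature, different atomic values
            result[feature] = value      # add the feature or confirm its value
        return result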

[CAT NP, PERSON 3, NUMBER SINGULAR] ⊔ [CAT NP, NUMBER SINGULAR, PERSON 3]
  = [CAT NP, NUMBER SINGULAR, PERSON 3]
(successful unification: the feature values are compatible; ⊔ = unification)

[CAT NP, PERSON 3, NUMBER SINGULAR] ⊔ [CAT NP, PERSON 1]
  = FAILURE
(unification fails: PERSON 3 conflicts with PERSON 1)
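
Using the unify sketch from the earlier example, the two cases above come out as:

    ok = unify({"CAT": "NP", "PERSON": 3, "NUMBER": "SINGULAR"},
               {"CAT": "NP", "NUMBER": "SINGULAR", "PERSON": 3})
    print(ok)    # {'CAT': 'NP', 'PERSON': 3, 'NUMBER': 'SINGULAR'}

    bad = unify({"CAT": "NP", "PERSON": 3, "NUMBER": "SINGULAR"},
                {"CAT": "NP", "PERSON": 1})
    print(bad)   # None (FAIL): PERSON 3 conflicts with PERSON 1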

S  NP VP {NP PERSON} = {VP PERSON} S  Aux NP VP {Aux PERSON} = {NP PERSON} Verb  bites {Verb PERSON} = 3 Verb  bite {Verb PERSON} = 1

VP  Verb {VP SUBCAT} = {Verb SUBCAT} {VP SUBCAT} = INTRANS VP  Verb NP {VP SUBCAT} = {Verb SUBCAT} {VP SUBCAT} = TRANS VP  Verb NP NP {VP SUBCAT} = {Verb SUBCAT} {VP SUBCAT} = DITRANS

NLP

Introduction to NLP

Tree Substitution Grammar (TSG)
–Productions expand nonterminals into entire tree fragments (elementary trees)
–TSG and CFG are formally (weakly) equivalent
Tree Adjoining Grammar (TAG)
Combinatory Categorial Grammar (CCG)
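
A minimal sketch of tree substitution, with trees as nested tuples and frontier nonterminal strings as substitution sites; the elementary trees are toy assumptions.

    def substitute(tree, label, fragment):
        """Substitute `fragment` at the leftmost frontier node labeled `label`."""
        done = [False]
        def walk(t):
            if isinstance(t, str):
                if t == label and not done[0]:
                    done[0] = True
                    return fragment          # plug in the elementary tree
                return t
            return (t[0],) + tuple(walk(child) for child in t[1:])
        return walk(tree)

    # Toy elementary trees
    eats_tree   = ("S", "NP", ("VP", ("V", "eats"), "NP"))
    javier_tree = ("NP", ("N", "Javier"))
    pizza_tree  = ("NP", ("N", "pizza"))

    step1 = substitute(eats_tree, "NP", javier_tree)   # fill the subject slot
    step2 = substitute(step1, "NP", pizza_tree)        # fill the object slot
    print(step2)
    # ('S', ('NP', ('N', 'Javier')), ('VP', ('V', 'eats'), ('NP', ('N', 'pizza'))))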

Like TSG, but adjunction is also allowed
It can generate languages like a^n b^n c^n or ww (cross-serial dependencies)
–e.g., Mary gave a book and a magazine to Chen and Mike, respectively.
Expressive power
–TAG is formally more powerful than CFG
–TAG is less powerful than CSG
Card game online! – –

Complex types
–e.g., X/Y and X\Y
–These take an argument of type Y and return an object of type X
–X/Y means that Y should appear on the right
–X\Y means that Y should appear on the left
Expressive power
–CCGs can generate the language a^n b^n c^n d^n, n > 0
Example from Jonathan Kummerfeld, Aleka Blackwell, and Patrick Littell
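
A small sketch of forward and backward application with CCG-style categories (derivation only, no semantics); the lexicon and the tuple encoding of categories are assumptions.

    # Categories are atomic strings ("NP", "S") or (result, slash, argument) triples.
    LEXICON = {
        "Javier": "NP",
        "pizza":  "NP",
        "eats":   (("S", "\\", "NP"), "/", "NP"),   # (S\NP)/NP: a transitive verb
    }

    def combine(left, right):
        """Apply forward (X/Y  Y => X) or backward (Y  X\\Y => X) application."""
        if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
            return left[0]                            # forward application
        if isinstance(right, tuple) and right[1] == "\\" and right[2] == left:
            return right[0]                           # backward application
        return None                                   # the categories do not combine

    vp = combine(LEXICON["eats"], LEXICON["pizza"])   # (S\NP)/NP + NP  =>  S\NP
    s  = combine(LEXICON["Javier"], vp)               # NP + S\NP      =>  S
    print(vp, s)   # ('S', '\\', 'NP') S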

Introduction to NLP

Associate a semantic expression with each node
Example: "Javier eats pizza"
  N: Javier
  V: λx,y. eat(x,y)
  N: pizza
  VP: λx. eat(x, pizza)      (= V applied to pizza)
  S: eat(Javier, pizza)      (= VP applied to Javier)
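
A sketch of the same derivation with Python functions standing in for the lambda terms; the string-building denotations are assumptions made for illustration.

    # Denotations for each word, mirroring the node labels above.
    eat    = lambda x, y: f"eat({x},{y})"     # V:  λx,y. eat(x,y)
    pizza  = "pizza"                          # N:  pizza
    javier = "Javier"                         # N:  Javier

    vp = lambda x: eat(x, pizza)              # VP: λx. eat(x, pizza)
    s  = vp(javier)                           # S:  eat(Javier, pizza)
    print(s)                                  # eat(Javier,pizza)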

Introduction to NLP

Twodee (by Jason Eisner) –
One, Two, Tree (by Noah Smith, Kevin Gimpel, and Jason Eisner) –
CCG (by Jonathan Kummerfeld, Aleka Blackwell, and Patrick Littell) –
Combining categories in Tok Pisin (same authors) –
Grammar Rules (Andrea Schalley and Pat Littell) –
Sk8 Parser (Pat Littell) –

NLP